TECHNICAL NOTE
J Pathol Inform 2021,  12:20

Remote reporting during a pandemic using digital pathology solution: Experience from a tertiary care cancer center


Department of Histopathology, Strand Life Sciences - Health Care Global Cancer Hospital, Bengaluru, Karnataka, India

Date of Submission09-Dec-2020
Date of Decision28-Dec-2020
Date of Acceptance01-Mar-2021
Date of Web Publication08-Apr-2021

Correspondence Address:
Dr. Veena Ramaswamy
Strand Life Sciences-Health Care Global Hospital, No. 08, Tower 01, P Kalinga Rao Road, Sampangiram Nagar, Bengaluru - 560 027, Karnataka
India

Source of Support: None, Conflict of Interest: None


DOI: 10.4103/jpi.jpi_109_20

Abstract


Background: Remote reporting in anatomic pathology is an important advantage of digital pathology that has not been widely explored. The COVID-19 pandemic provided an opportunity to explore this application of a digital pathology system in a tertiary care cancer center to ensure patient care and staff safety. Regulatory guidelines for remote reporting have been described following the pandemic. Herein, we describe our experience of validating a digital pathology workflow for remote reporting, to encourage pathologists to utilize this facility, which opens the door for multiple multidisciplinary collaborations. Objective: To demonstrate the validation and operational feasibility of remote reporting using a digital pathology system. Materials and Methods: Our retrospective validation included whole-slide images (WSIs) of 60 histopathology cases and 20 cases each of frozen sections and a digital image-based breast algorithm, after a washout period of 3 months. Three pathologists with different models of consumer-grade laptops reviewed the cases remotely to assess the diagnostic concordance and the operational feasibility of the modified workflow. The slides were digitized on a USFDA-approved Philips UFS 300 scanner at ×40 resolution (0.25 μm/pixel) and viewed on the Image Management System through a web browser. All essential parameters were reported for each case. After successful validation, 886 cases were reported remotely and prospectively from March 29, 2020, to June 30, 2020. Light microscopy formed the gold standard reference for remote reporting. Results: 100% major diagnostic concordance was observed for remote reporting in both the retrospective and prospective studies using consumer-grade laptops. The deferral rate was 0.34%. 97.6% of histopathology cases and 100% of frozen sections were signed out within the turnaround time. Network speed and the lack of a virtual private network did not significantly affect the study. Conclusion: This study of validation and remote reporting of complete pathology cases, including their operational feasibility during a public health emergency, shows that remote sign-out using a digital pathology system is not inferior to reporting WSIs on medical-grade monitors or light microscopy. Such studies on remote reporting open the door for the use of digital pathology for interinstitutional consultation and collaboration, its main intended use.

Keywords: Consumer-grade laptop, COVID-19, image management system, laboratory information system, local area network, quality control, remote reporting, virtual private network, whole-slide image


How to cite this article:
Ramaswamy V, Tejaswini B N, Uthaiah SB. Remote reporting during a pandemic using digital pathology solution: Experience from a tertiary care cancer center. J Pathol Inform 2021;12:20





Introduction


Digital pathology is the technology of converting a glass slide with a tissue section on it into a digital image. This image can then be acquired, viewed, annotated, shared, networked, archived, and retrieved.[1] Used primarily in research and education, this technology provides access to digital images or whole-slide images (WSI) from anywhere across the globe, supporting collaboration and remote sign-out. USFDA approval of two digital pathology solutions for primary diagnosis has enabled their application in clinical diagnosis.[2],[3]

Our laboratory is attached to a large oncology hospital chain. Since March 2019, we have successfully transformed it into a 100% digital pathology laboratory for primary diagnosis in histopathology. The digital pathology deployment includes:

  1. An FDA-approved Philips UFS 300 (Ultrafast Scanner 300) with Image Management System (IMS) software
  2. Medical-grade Barco monitors; and
  3. A 150 TB local server for archiving.


The hospital's intranet/local area network provided 600–800 Mbps bandwidth, with access to the laboratory information system (LIS), picture archiving and communication system (PACS), and a web browser.

The UFS 300 scans at ×40 equivalent resolution (0.25 μm/pixel) using an Olympus 340 Plan Apo objective with a numerical aperture of 0.75, producing images of 1.0–1.5 GB. A major advantage of this system is its unsupervised, high-throughput, walk-away scanning. WSIs are the primary mode of subspecialty reporting in the laboratory. Over 500 slides are generated and digitized daily from formalin-fixed, paraffin-embedded tissue, including H&E stains, special stains, immunohistochemistry (IHC), and frozen section slides. Digital pathology is fully integrated into the workflow, from accession of samples to sign-out of reports in the LIS. Each folder on the IMS contains all the information necessary for reporting a case, including patient demographics, the test request form (TRF) with clinical details, gross examination findings with section codes, and gross specimen images. During the COVID-19 pandemic, our pathologists worked from home on consumer-grade laptops. From March 25 to June 30, 2020, remote reporting of the entire day's cases was tried for the first time.
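To put these volumes in perspective, the short sketch below estimates how many days of scanning the 150 TB archive can absorb, using the daily slide count and per-image sizes quoted above; the usable-capacity fraction is an illustrative assumption rather than a figure from our deployment.

```python
# Back-of-the-envelope archive sizing for the digital pathology deployment.
# Figures from the text: >500 WSIs/day at 1.0-1.5 GB each, 150 TB local server.
# usable_fraction is an assumed value for illustration, not a measured one.

def archive_capacity_days(slides_per_day: int = 500,
                          avg_wsi_gb: float = 1.25,     # midpoint of 1.0-1.5 GB
                          archive_tb: float = 150.0,
                          usable_fraction: float = 0.9) -> float:
    """Approximate number of days of scanning the archive can hold."""
    usable_gb = archive_tb * 1000 * usable_fraction     # 1 TB taken as 1000 GB
    daily_gb = slides_per_day * avg_wsi_gb
    return usable_gb / daily_gb

if __name__ == "__main__":
    days = archive_capacity_days()
    # ~216 days before older WSIs must be purged or tiered to other storage.
    print(f"Approximate archive capacity: {days:.0f} days (~{days / 30:.1f} months)")
```

Under these assumptions, the local server holds roughly seven months of routine scan output.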

This study presents the validation of the digital pathology system for remote reporting to support the clinical workflow while ensuring the safety of pathology staff. The aims were:

  1. Validation of a modified workflow and operational feasibility for remote reporting using digital pathology system
  2. Assessment of diagnostic concordance between consumer-grade monitors, medical-grade monitors, and light microscopy.



Materials and Methods


Pre-COVID workflow

[Figure 1] describes the pre-COVID-19 workflow. Each sample accessioned in the laboratory was registered with a unique “H-No” (histopathology number). Gross examination of specimens was carried out by residents as per our institutionally defined protocols. Tissue cassettes were labeled with unique alphanumeric codes representing the site of sampling. Tissues were processed on an overnight schedule of 12–14 h on two automated tissue processors: Leica ASP 300 and Thermo Fisher ELIXIR. They were then embedded on a Leica Biosystems EC 1150 embedding station, ensuring proper orientation of the tissue and even spread in the center of the block. Sections of 2–3 μm thickness were cut on Leica rotary microtomes RTM 2250 and 2150. The sections were spread uniformly, without folds, on 1.0-mm-thick glass slides. Each tissue section was placed in the center of the slide, with no more than two smaller tissue sections on one slide. After staining with H&E on a Leica Biosystems Autostainer 5020, the slides were coverslipped on a Sakura Tissue-Tek automated coverslipper using film coverslips. 2D data matrix barcode labels were generated from the LIS and placed on the frosted end of each glass slide, with no overhanging edges. The prescan QC check ensured uniform slide edges, uniformly thin sections without tissue folds, no tissue outside the coverslip, no air bubbles, no markings, no overhanging coverslips, and no overhanging labels. Finally, the slides were loaded onto the UFS 300 scanner for overnight scanning.
Figure 1: Digital pathology workflow pre COVID-19 and during lockdown for remote reporting



IHC slides stained on the Roche Ventana Benchmark XT and Biocare IntelliSite Pathology solutions were also digitized. Based on the details in the 2D data matrix barcode, all WSIs relating to a case were assembled into one folder on the IMS, tagged to the respective organ system, and assigned to the respective pathologist. The respective TRFs with clinical details, the gross specimen descriptions with section codes, and the gross specimen images were uploaded, so that all the information necessary for reporting a case was available in the IMS folder.
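As a simple illustration of this barcode-driven assembly, the sketch below groups scanned images into per-case folders keyed by the accession number; the pipe-delimited barcode payload and the record fields are hypothetical, since the actual IMS grouping logic is internal to the vendor's software.

```python
# Illustrative grouping of WSIs into per-case folders using barcode metadata.
# The payload format "H-No|block-code|stain" is a hypothetical layout; the real
# Philips IMS performs this assembly internally from the 2D data matrix barcode.
from collections import defaultdict
from dataclasses import dataclass

@dataclass
class WSIRecord:
    h_no: str    # histopathology accession number, e.g. "H-2020-1234"
    block: str   # alphanumeric section code identifying the sampling site
    stain: str   # "H&E", "IHC-ER", a special stain, etc.

def parse_barcode(payload: str) -> WSIRecord:
    """Parse an assumed pipe-delimited barcode payload into a WSI record."""
    h_no, block, stain = payload.split("|")
    return WSIRecord(h_no=h_no, block=block, stain=stain)

def assemble_cases(payloads: list[str]) -> dict[str, list[WSIRecord]]:
    """Group all WSIs sharing an H-No into one case folder for the pathologist."""
    folders: dict[str, list[WSIRecord]] = defaultdict(list)
    for payload in payloads:
        record = parse_barcode(payload)
        folders[record.h_no].append(record)
    return dict(folders)

# Example: two slides from the same case end up in the same folder.
print(assemble_cases(["H-2020-1234|A1|H&E", "H-2020-1234|A2|IHC-ER"]))
```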

Image quality check

To avoid image artifacts, the digital pathology coordinator performed a macroevaluation of the slides as per the QC criteria before they were loaded into the scanner. Both the coordinator and the reporting pathologist verified the quality of the WSIs for each case. All outliers were documented, and rescanning was requested after appropriate root cause analysis and corrective action.

Report sign-out

The trainees (fellows and residents) first previewed the cases, either on light microscopes or on the WSIs. They then composed a preliminary report for each case in the report templates. Finally, they reviewed the case again with the respective subspecialist pathologist for final reporting. The reports were prepared as free text, worksheet templates, and synoptic reports based on College of American Pathologists (CAP) templates. They were then dictated, transcribed into the LIS, and digitally signed out by the pathologist in charge. Cases that met the deferral criteria, such as those involving Helicobacter pylori or other microorganisms, or where the pathologist was not comfortable reporting the case on WSI, were reported on light microscopy (the term microscopy in this paper refers to traditional, analog brightfield microscopy).

The average time to scan was 88 s, with an average scanned tissue size of 535 mm². The rescan rate was 0.51%, attributed mainly to barcode errors and large sections extending beyond the coverslip. In 0.27% of cases, regions of interest (ROIs) were missed because of fatty tissue. The deferral rate for microscopy was 0.28%. As a QC practice, every 25th case, along with ten randomly selected cases, was re-reported microscopically every month; 100% diagnostic concordance has been observed.
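The monthly QC rule described above (every 25th case plus ten randomly chosen cases re-reported on the microscope) can be written as a small selection routine; the sketch below is an illustrative implementation of that sampling scheme, not the laboratory's actual software.

```python
# Illustrative selection of cases for monthly microscopy QC re-review:
# every 25th accessioned case plus ten additional randomly chosen cases.
import random

def select_qc_cases(case_ids: list[str], interval: int = 25,
                    random_extra: int = 10, seed: int = 0) -> list[str]:
    """Return the QC re-review worklist for one month of signed-out cases."""
    rng = random.Random(seed)
    systematic = case_ids[interval - 1::interval]             # every 25th case
    remaining = [c for c in case_ids if c not in systematic]
    extra = rng.sample(remaining, min(random_extra, len(remaining)))
    return systematic + extra

# Example: a month with 500 signed-out cases yields 20 systematic + 10 random reads.
month = [f"H-{i:04d}" for i in range(1, 501)]
print(len(select_qc_cases(month)))   # -> 30
```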

Workflow modification during COVID-19

[Figure 1] describes the modified workflow after the imposition of pandemic lockdowns, which required pathologists to work from home. No modification was required in the prescan process, i.e., specimen accession, gross specimen examination, preparation of formalin-fixed and paraffin-embedded tissue blocks, staining, and generation of stained slides with 2D barcodes. Frozen sections were handled in the same way, but with additional safety precautions as per the CDC guidelines. The postscan process was modified and validated. This included remote access to the digital pathology images for reporting and sign-out of reports in the LIS. Residents were included in the workflow for previewing cases, transcribing reports into the LIS, and verifying reports for typographical errors before sign-out by pathologists.

Human resources

Technologists and trainees who stayed close to the hospital were instructed to be available on rotation. Among the four consultant pathologists, one was 67 years old and was therefore requested to stay at home as per government instructions. The other three pathologists took turns visiting the hospital every 3rd day. These steps ensured social distancing, with minimal staff present in the laboratory. Procedures for safe handling of specimens during the pandemic were also implemented in the department as per the updated guidelines.[4],[5],[6],[7],[8],[9] Frozen sections were accepted only when necessary, following discussion with the surgeons, and only if the patient had a negative presurgical SARS-CoV-2 screening report (reverse transcription polymerase chain reaction).

Personal computers

All the pathologists and residents in the department owned personal consumer-grade laptops, with monitor sizes ranging from 12 to 15.6 inches (30.48–39.62 cm). All had internet access at their remote location through personal hotspots from data cards, routers, or 4G mobile phones, providing bandwidths of 20–200 Mbps (megabits per second). Because of limited LIS installation licenses, none had a virtual private network (VPN) set up on their personal laptops for access to the LIS.

The three pathologists had monitors with display resolutions ranging from 1280 × 800 pixels to 1440 × 900 pixels. Two owned MacBook Air laptops with built-in graphics and the following standard specifications: 13.3-inch diagonal LED-backlit display with IPS technology, 2560 × 1600 native resolution at 227 pixels per inch, and M1 chip with 8-core CPU, 7-core GPU, and 16-core Neural Engine. The third owned a Dell model with a 14-inch monitor, an 11th-generation Intel® Core™ i5-1135G7 processor, and NVIDIA® GeForce® MX350 graphics with 2 GB GDDR5 memory. They logged into the IMS through the secure URL on a web browser, using two-step authentication, and connected to their respective workstations through the “AnyDesk” application to access the LIS and PACS. Each use case was validated on the personal laptops before implementation of remote reporting, as per the CAP[10] and Royal College of Pathologists (RCPath)[11] guidelines. The entire process was assessed for risks, following which risk mitigation steps were defined.
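To put the 20–200 Mbps home connections in context, the sketch below compares pulling an entire WSI of the sizes reported in this study against streaming only the tiles needed to paint one viewport, as a browser-based viewer typically does; the tile size and viewport tile count are illustrative assumptions, not measured values from the IMS.

```python
# Rough comparison of whole-file transfer vs. tile streaming at home bandwidths.
# WSI size (~1.2 GB) and bandwidths (20-200 Mbps) come from the text; the tile
# size (~30 KB JPEG) and tiles-per-viewport (~100) are illustrative assumptions.

def transfer_seconds(size_bytes: float, mbps: float) -> float:
    """Time to move size_bytes over a link of the given speed in megabits/s."""
    return (size_bytes * 8) / (mbps * 1_000_000)

WSI_BYTES = 1.2e9          # typical compressed WSI in this study
TILE_BYTES = 30_000        # assumed average JPEG tile
TILES_PER_VIEWPORT = 100   # assumed tiles needed to paint one screenful

for mbps in (20, 200):
    whole = transfer_seconds(WSI_BYTES, mbps)
    viewport = transfer_seconds(TILE_BYTES * TILES_PER_VIEWPORT, mbps)
    print(f"{mbps:>3} Mbps: whole WSI ~{whole / 60:.1f} min, one viewport ~{viewport:.1f} s")
# At 20 Mbps a whole WSI would take ~8 min to download, but a viewport of tiles
# loads in about a second, which is why browser-based viewing remained usable.
```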

Modification of reporting

For easy reference, the subspecialist pathologists were supplied with blank worksheet templates and synoptic templates for each organ system. Against each case, the pathologists entered their reports into the “Reporting Grid” on the IMS. These were printed from the IMS by the trainees and transcribed into the LIS. The pathologists remotely connected to their workstations and signed out the reports in the LIS. Other communications, such as requests for additional sections, deeper or serial sections, special stains, and IHCs, were conveyed by phone call or e-mail to the coordinator or indicated on the “Reporting Grid” in the IMS.

Validation

At the time of installation of the scanner, our initial retrospective validation study as per the CAP guidelines[10] had shown:

  1. 97.6% concordance with light microscopy for primary diagnosis in histopathology
  2. 96.2% concordance for cytopathology
  3. 100% concordance for frozen sections.


The prospective validation study as per the RCPath guidelines[12] had shown a concordance of 99.3% for primary diagnosis. Digital image analysis with the Visiopharm breast algorithm had shown 100% concordance for ER, PR, and HER2 and 97.2% concordance for Ki-67. These were validated on site on medical-grade Barco monitors. However, when reporting from home, since each pathologist used a different grade of monitor (rendering images at different resolutions), digital image analysis on each screen needed to be validated.[11]

The following assumptions were made for the retrospective validation of remote reporting:

  1. The WSIs available for review had already been reported on medical-grade monitors on site
  2. These WSIs are comparable to light microscopy.


Case assignment

Each subspecialist pathologist was assigned WSIs of 100 randomly selected retrospective cases, with a washout period of 3 months. The 60 histopathology cases included 40 small biopsies and 20 large radical resection specimens; 18 were benign and 42 were malignant. The turnaround time was 2 days for a small biopsy and 4 days for a large resection specimen. Twelve cases had IHC slides along with H&E. Twenty cases each of frozen sections and breast biomarkers were also selected and assigned randomly. In all, there was a total of 2142 WSIs from 300 cases. The pathologists were blinded to the signed-out diagnoses of these cases.
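A minimal sketch of how such a validation set could be drawn is shown below: only cases past the 3-month washout are eligible, and the 60/20/20 mix described above is sampled per pathologist. The case structure and field names are illustrative assumptions, not our LIS schema.

```python
# Illustrative draw of a retrospective validation set for one pathologist:
# cases past the 3-month washout are eligible; 60 histopathology cases,
# 20 frozen sections, and 20 breast biomarker cases are sampled at random.
import random
from dataclasses import dataclass
from datetime import date, timedelta

@dataclass
class Case:
    h_no: str
    category: str        # "histopathology", "frozen", or "breast_biomarker"
    signed_out: date     # date of the original on-site sign-out

def draw_validation_set(cases: list[Case], today: date,
                        washout_days: int = 90, seed: int = 0) -> list[Case]:
    """Randomly sample the 60/20/20 validation mix from washed-out cases."""
    rng = random.Random(seed)
    cutoff = today - timedelta(days=washout_days)
    eligible = [c for c in cases if c.signed_out <= cutoff]
    quota = {"histopathology": 60, "frozen": 20, "breast_biomarker": 20}
    selected: list[Case] = []
    for category, n in quota.items():
        pool = [c for c in eligible if c.category == category]
        selected.extend(rng.sample(pool, min(n, len(pool))))
    return selected
```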

Accessing cases and reporting

The pathologists accessed the cases through the secure URL on their web browser (usually Google Chrome), reviewed them, and entered their reports in the “Reporting Grid” on the IMS for each case. The trainees downloaded these reports from the IMS and correlated them with the signed-out diagnoses. The pathologists had been given all the relevant data on the IMS, including the TRFs and gross specimen images. The reporting metrics included all relevant information, such as top-line diagnosis, margin status, lymphovascular and perineural invasion, mitotic count, lymph node status including extracapsular extension, pathological stage, and ancillary testing where available.

Exclusion

Cytopathology cases and cases that met deferral criteria were excluded from the study.

Acceptance criteria

The acceptance criterion was a diagnostic concordance of >96%.[13],[14]
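Concordance against this criterion reduces to simple counting; the sketch below computes the per-pathologist concordance rate and the acceptance check, with the 96% threshold and the 60-case denominator mirroring the study design (the paired read lists themselves are illustrative).

```python
# Illustrative check of diagnostic concordance against the >96% acceptance criterion.

def concordance_rate(reference: list[str], remote: list[str]) -> float:
    """Fraction of cases where the remote diagnosis matches the reference read."""
    assert len(reference) == len(remote), "paired reads required"
    matches = sum(ref == rem for ref, rem in zip(reference, remote))
    return matches / len(reference)

def meets_acceptance(rate: float, threshold: float = 0.96) -> bool:
    """Acceptance criterion used in this validation: concordance above 96%."""
    return rate > threshold

# Example with the study's own figures: 59 of 60 concordant reads -> 98.3%, accepted.
rate = 59 / 60
print(f"{rate:.1%} concordant, acceptance met: {meets_acceptance(rate)}")
```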

Results of retrospective validation

Operational feasibility

All pathologists were given a survey after the sign-out of each case. They rated each case on a scale of 0–5, with “0” being unsatisfactory and “5” being very satisfactory [Chart 1].



On the operational feasibility of remote sign-out, all three pathologists concurred positively. They did not have to make any changes to their personal laptops for image analysis. They could easily access the secure URL through a web browser, with good image quality and functional toolbars. They could access all the data in the folder and could easily navigate between folders and WSIs. They could access their workstations and sign out the reports in the LIS. While there was network lag on some occasions, it did not significantly affect the turnaround time or image quality. None of them felt the need to change the illumination, sharpness, contrast, or color intensity on their consumer-grade laptops [Figure 2]. All the pathologists consented to the practice of remote sign-out. They found it easy to upload reports to the “Reporting Grid” on the IMS and to download reports from the IMS.
Figure 2: Images from laptop; (a and b) invasive breast carcinoma; (c) lymph node with mitosis marked by green circle and red arrow; (d) giant cell tumor of bone on laptop



Concordance rate

WSIs reported on site on medical-grade monitors and remotely on personal laptops showed 100% major intraobserver diagnostic concordance. This was true for all three pathologists.

Two minor discordances were observed: a case reported as dermatofibroma on the medical-grade monitor was reported as neurofibroma on the consumer-grade laptop, and a case of pyogenic granuloma reported on the medical-grade monitor was reported as granulation tissue on the consumer-grade laptop. Neither significantly affected clinical management. Both were reviewed on light microscopy, which concurred with the WSI diagnosis on the medical-grade monitor. The three pathologists demonstrated 100% (60/60), 98.3% (59/60), and 98.3% (59/60) diagnostic concordance, with an average intraobserver diagnostic concordance of 98.9%. Frozen sections and the breast algorithm each showed 100% concordance (20/20).

Report sign-out

The residents downloaded the reports from the “Reporting Grid” on the IMS and transcribed them into the LIS. The pathologists accessed their workstation through “AnyDesk” remotely and signed out the reports in the LIS. In all the cases (100%), the target turnaround time was met.

Consensus on retrospective validation

All three pathologists concurred positively on the operational feasibility of remote reporting and expressed their willingness to sign out reports remotely. As the workload during the pandemic was very low, we decided to implement remote sign-out during the 3 months of lockdown, in correlation with reading on medical-grade monitors and light microscopy.

Prospective validation – Experience “Going Live”

Pathologists visited the laboratory every 3rd day by rotation. Remote sign-out included all specimen types, such as biopsies, large radical resection specimens, special stains, and IHC slides. As part of prospective validation, all cases that were reported remotely were reported again on medical-grade monitors and light microscopy during a laboratory visit. A provisional report was signed out remotely, subject to review on medical-grade monitors and microscopes.

Caseload and distribution

During the lockdown period, 998 specimens were registered. Of these, 886 cases were reported remotely and signed out as “provisional reports.” 68 cases were reported on microscopy alone as they met our deferral criteria. Slides received in the remaining 44 cases (secondary consultations from outside) were not digitized as they failed prescan QC because of poor section and staining quality, handwritten labels, or overhanging coverslips; these had to be reported microscopically.

Of the 886 digitized cases, 576 were small biopsies such as needle biopsies, punch biopsies, or curettages, and 310 were large radical resection specimens; 195 were benign and 691 were malignant. A total of 8292 H&E and IHC glass slides were digitized. [Table 1] shows the subspecialty-wise breakdown of cases. Breast cases formed the major load, followed by head and neck cases. Each case had three reads (laptop, medical-grade monitor, and light microscopy), giving a total of 2658 reads, as in [Table 2]. [Table 3] shows the case distribution for each pathologist.
Table 1: Subspecialty distribution of cases for remote reporting

Table 2: Distribution of biopsies and radical resection cases for remote reporting

Table 3: Subspecialty pathologists distribution of cases, reads, and concordance rates



Of the 12 frozen sections received during this period, 4 were breast lumpectomies and 5 were hemimandibulectomies for margin assessment. One case each of a solitary thyroid nodule, a hysterectomy, and an ovarian mass was received for diagnosis. The cases were reported remotely by the respective subspecialist.

Forty-three breast biomarker panels (ER, PR, HER2, and Ki-67) were reported remotely using the breast algorithm. Correlation with morphology, tumor grade, and microscopy was done before sign-out.

Case assignment

After scanning, the slides were immediately made available in the IMS for reporting. The digital pathology coordinator then sent the pathologists a worklist of the day's cases by e-mail or WhatsApp message. The TRFs with clinical details and gross examination findings (along with section codes and gross specimen images) were uploaded in the IMS folder for each case [Figure 3]. The reporting template, with the trainees' observations, was also available in the “Reporting Grid.” The pathologists could also access radiology images on PACS through their workstations.
Figure 3: Images from Image Management System on laptop; (a) Test request form; (b) gross specimen image; (c) case assembly; (d) “Reporting Grid;” (e) breast algorithm; (f) PDF copy of report on Image Management System



Reporting and sign-out

Trainees previewed the cases prior to the pathologists' review, either on the light microscope or by viewing WSIs on the medical-grade monitor, and uploaded templates in the “Reporting Grid” for some cases. The pathologists then logged into the IMS through the secure URL on their web browser and independently reviewed all the cases. For each case, the findings were documented in the “Reporting Grid” on the IMS, and the trainees transcribed them into the LIS. For some cases, reporting was done in collaboration with trainees when network speed was good. The pathologists accessed their workstations remotely through “AnyDesk” and signed out the reports in the LIS as “provisional reports.” When the pathologists visited the laboratory on their rotation, they reviewed the WSIs again on medical-grade monitors and light microscopy and signed out a final report. The interval between remote sign-out and on-site review was a maximum of 2 days. The breast algorithm results were correlated with on-site WSI review and light microscopy before reporting.
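The provisional-then-final sequence just described can be viewed as a small report state machine; the sketch below models it with the 2-day review window reported in this study, while the class itself and its method names are an illustrative abstraction rather than the LIS's actual implementation.

```python
# Illustrative state model of the lockdown reporting workflow:
# remote provisional sign-out, then on-site review and final sign-out within 2 days.
from datetime import date, timedelta

class CaseReport:
    REVIEW_WINDOW = timedelta(days=2)   # maximum interval observed in this study

    def __init__(self, h_no: str):
        self.h_no = h_no
        self.status = "pending"          # pending -> provisional -> final
        self.provisional_date = None

    def sign_out_remote(self, when: date) -> None:
        """Pathologist signs a provisional report from home through the LIS."""
        self.status = "provisional"
        self.provisional_date = when

    def sign_out_final(self, when: date) -> None:
        """On-site review on medical-grade monitor/microscope finalizes the report."""
        assert self.status == "provisional" and self.provisional_date is not None
        assert when - self.provisional_date <= self.REVIEW_WINDOW, "review overdue"
        self.status = "final"

# Example: provisional sign-out on day 1, final sign-out on the next lab visit.
report = CaseReport("H-2020-2001")
report.sign_out_remote(date(2020, 4, 1))
report.sign_out_final(date(2020, 4, 3))
print(report.status)   # -> "final"
```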

Exclusion

  • Slides received from outside for secondary consultation were not digitized due to prescan QC failure. Instead, these were reported on light microscopy
  • Cytopathology cases were not reported remotely. CSF samples were reported by the pathologist on duty
  • Cases that met deferral criteria were reported microscopically.


Result of “Going-Live” (prospective validation)

For each case, the remote sign-out of the provisional report was compared with the on-site review on medical-grade monitors and light microscopy, and concordance metrics were captured. Further, as done for retrospective validation, an experience survey was conducted.

Diagnostic concordance

For all the cases, the major diagnostic concordance between remote reporting on laptops, light microscopy, and WSIs reviewed on site on medical-grade monitors was 100%. Only three minor discordances were observed, as in [Table 4]. The overall diagnostic concordance was 99.7%. Frozen section reports showed 100% concordance with paraffin sections. The breast algorithm also showed 100% concordance with on-site WSIs and light microscopy. The reads on WSIs on medical-grade monitors and laptops also showed 100% diagnostic concordance.
Table 4: Discordant cases between reads on laptop, medical grade monitor, and light microscopy



The histological grade differed in three cases, with agreement between the medical-grade monitor and light microscopy. Lymphovascular and perineural invasion was missed in two cases, as in [Table 5]. Possible reasons include the wider screen available on medical-grade monitors and pathologist oversight.
Table 5: Concordance between different pathologic observations among three modalities



Deferral rate

Of the 886 cases reported remotely, 3 were deferred for microscopy: one case each of granulosa cell tumor of the ovary, a needle biopsy of Hodgkin lymphoma in a retroperitoneal node, and nodular lymphocyte-predominant Hodgkin lymphoma. This corresponds to a deferral rate of 0.34%.

The pathologists collaborated online and consulted for opinions in four cases. They concurred on cases of serous carcinoma of the ovary with SET-like features, giant cell tenosynovitis (localized), and a case of Kikuchi's lymphadenitis. The fourth case, where a consensus was not reached (Type B1 thymoma vs. non-Hodgkin lymphoma), was reviewed on medical-grade monitor and light microscopy on a visit to the laboratory. A consensus opinion of Type B1 thymoma was signed out.

WSI provided a better diagnosis in two cases [Table 6]. The diagnoses on the medical-grade monitor and the laptop screen were concordant but did not match light microscopy. A second look on microscopy confirmed the WSI diagnosis; the initial discordance was attributed to pathologist oversight while maneuvering the glass slide on the microscope.
Table 6: Discordant cases between reads on laptop, medical grade monitor and light microscopy



Turnaround time

Pathologists met the turnaround time, including for frozen sections, for 97.3% of the cases. The 2.7% of cases that were delayed either required additional sampling from the specimen or discussion in the virtual tumor board.

Rescan rate

As our workflow was validated and well monitored, the rescan rate during the lockdown period was only 0.33%, attributed mainly to large sections extending beyond the coverslip and barcode errors. The ROI was missed in 0.17% of cases due to purely fatty tissue. These did not significantly affect interpretation, as the macroimage option on the IMS helped identify missed areas on the slide. A second look on medical-grade monitors and light microscopy also showed no missed diagnoses, correlating with the macroimage view.

Network speed

Network speed, though slow on some days, did not significantly affect the image quality or turnaround time for reports.


Discussion


This study presents the retrospective and prospective validation of a digital pathology solution and its operational feasibility for remote reporting. Retrospective validation included establishing diagnostic concordance between WSIs on medical-grade monitors and consumer-grade laptops as well as validating the modified workflow for remote sign-out. Prospective validation included monitoring operational feasibility, training pathologists on consumer-grade laptops, and establishing diagnostic concordance between WSIs on laptops, medical-grade monitors, and microscopy. Our study successfully demonstrated the operational feasibility of remote reporting, with 100% major diagnostic concordance between the three modalities.

The CAP, the Digital Pathology Association, and the RCPath provide guidelines for the validation of digital pathology systems for primary diagnosis.[10],[12],[15] Several validation and noninferiority concordance studies from institutions across the globe show good concordance between WSIs and light microscopy.[13],[16],[17],[18],[19],[20],[21],[22],[23] However, there are no formal studies comparing the diagnostic concordance of WSIs on medical-grade monitors, consumer-grade laptops, and light microscopy. To our knowledge, studies comparing remote reporting with on-site reporting are also not available, except one recently conducted at Memorial Sloan-Kettering Cancer Center (MSKCC)[24] and those by Vodovnik et al.[25],[26],[27] Our study is similar to these studies but, in addition, shows the validation of operational feasibility and diagnostic concordance between the three modalities: remote reporting on laptops, on-site reporting on medical-grade monitors, and light microscopy.

Following the approval of remote sign-out by the Centers for Medicare and Medicaid Services/Clinical Laboratory Improvement Amendments (CMS/CLIA) and the USFDA approval of two digital pathology systems for primary diagnosis and on-site reporting,[1],[3],[28] pathologists are showing increasing acceptance of digital pathology for primary diagnosis. Although collaboration and remote reporting are the main advantages of digital pathology, they have not been widely explored and adopted. The COVID-19 pandemic has presented an opportunity for pathologists to review and report remotely, ensuring the safety of health-care personnel without compromising the continuity of patient care. Our study is one of the first of its kind to demonstrate that digital pathology is an excellent tool for remote reporting.

Currently, there are no regulatory guidelines on the use of digital pathology systems in India. Our study demonstrates that remote reporting is not inferior to on-site reporting, whether on medical-grade monitors or by light microscopy.

Guidelines for working from home were provided by the RCPath in 2014.[29] Recently, the General Medical Council approved remote reporting in the United Kingdom,[30] and the CAP issued remote sign-out guidance in the United States[31],[32] to be used with appropriate validation in the wake of the COVID-19 emergency, to ensure the safety of health-care workers. Validation guidelines for remote reporting published recently by RCPath[11] emphasize the validation of computer monitors and training of pathologists.

The CAP and RCPath guidelines on the validation of digital pathology systems state that each pathology laboratory should perform its own validation study for each clinical use. In keeping with these guidelines, our validation included surgical pathology slides, frozen section slides, and IHC slides, including the breast algorithm from Visiopharm. Cases that met our deferral criteria, cytopathology cases, and secondary consultation cases were excluded from the study. We also validated our modified postscan workflow of remote access to the URL, diagnostic reporting on the IMS, ordering of ancillary tests, and remote sign-out of reports in the LIS. This also included validation of vendor-agnostic features, trainee involvement, and collaboration.

For validation, the CAP recommends a washout period of 2 weeks to avoid recall bias.[10] The WSIs selected for our retrospective validation had a washout period of 3 months. On the other hand, the washout period for prospective validation was just 2 days owing to the requirement of meeting the turnaround time for reports, as in the study at MSKCC.[24] The remote sign-out reports were available as “provisional reports,” which clinicians could view on the LIS. After review on-site, a final report was signed out.

Our retrospective validation study showed that the major intraobserver diagnostic concordance between WSIs on laptops and medical-grade monitors was 100%. Prospective validation with all three modalities also showed a major diagnostic concordance of 100%. This is in line with the Q-Probes study (which had a median discrepancy of 5.1%) and the MSKCC study. Our diagnostic concordance is also comparable to studies that show discordance of 4.9% and 3.6% for WSI with glass slides as the reference standard.[13],[14] Similar criteria were used in our study, with <4% as the acceptable discordance rate. The reference standard in our study was light microscopy for prospective validation; WSI on a medical-grade monitor, which is comparable to light microscopy in our laboratory, formed the reference for retrospective validation. We had no major intraobserver discrepancies in either retrospective or prospective validation. There were two minor discordances in retrospective validation and three in prospective validation, with overall intraobserver concordance rates of 98.9% and 99.7%, respectively, which is comparable to the study by Hanna et al. at MSKCC.[24]

Williams et al., in the RCPath guidelines,[11] recommend validation of remote reporting, including a risk mitigation strategy on the consumer monitors, before implementation. For comfortable reporting on consumer-grade monitors, they recommend the use of 24-inch monitors, changing the display to sRGB, and regular calibration. They observed that pathologists find it easy to adapt to remote reporting if they are experienced in reporting on WSIs and have been part of the validation study. They also emphasized the impact of digital slide quality and of the reporting environment, such as ergonomic factors and natural light. By including pathologists in our initial validation study and providing them with a well-validated workflow, good digital slide quality, and routine daily reporting on WSIs, we made the adoption of remote reporting easy. To assess their comfort levels, we validated the WSIs on their personal laptops and did not feel the need to make any changes to the laptops.

Clarke et al.[33],[34] have described a procedure for the validation of consumer monitors using a point-of-use quality assurance (POUQA) strategy. Our study did not indicate a need for such validation of consumer-grade monitors.

Wright et al.[11],[35] recommend a maximum luminance of 350 cd/m² or more, a resolution of 3 megapixels or a 24-inch display, and a display curve of gamma 2.2, or sRGB if a web browser is used. They also recommend quality assurance using the POUQA test. Our study showed 100% intraobserver major concordance between WSIs on Barco monitors and consumer-grade laptops without the need for any such change.

In a validation and concordance study by Hanna et al. at MSKCC,[24] 12 pathologists from nine surgical subspecialties participated in remote reporting during this public health emergency. They reported 100% intraobserver major diagnostic concordance with glass slides, with an overall concordance of 98.8%. All pathologists used computer monitors of 13.3–42.2 inches, with resolutions of 1280 × 800 to 3840 × 2160 pixels, connecting to an institutional workstation through a secure VPN without any change in their computer configuration. 108 cases were reported, with a median WSI size of 1.3 GB, an average scan time of 90 s, and an average tissue size of 612 mm². Our study included 100 validation cases and 886 live cases, with three pathologists participating over a period of 12 weeks. Our median WSI size was 1.21 GB, with an average scan time of 83 s and an average tissue size of 553 mm².

Vodovnik et al.,[14],[26] in their studies on validation of routine surgical pathology cases, included autopsies, cytology, and frozen sections, as well as remote sign-out. Diagnostic concordance between remote and on-site reporting was not assessed; however, a network speed of 20 Mbps was concluded to be adequate for remote reporting. In our study, the three pathologists used network speeds of 20–200 Mbps and did not face major technical problems.

Our validation study showed excellent concordance between WSIs on the medical-grade Barco monitors available on site and WSIs on the personal laptops used at remote locations. No adjustment of illumination, contrast, sharpness, or intensity was necessary. In addition, the scanning technology of the Philips UFS 300, the excellent image quality, and the ease of report upload on the IMS were added advantages. The case details and gross specimen images on the IMS made reporting easier, and good consumer-grade laptops proved technically adequate.

Our pathologists' experience in their respective subspecialties ranges from 5 to 19 years, and they have been using digital pathology for primary diagnosis on site for more than a year. All three opined positively on remote reporting and sign-out, and on the use of consumer-grade laptops, web browsers, and sign-out in the LIS. The two limitations we faced, the lack of a VPN on the laptops to access WSIs through contextual launch from our LIS and occasional slow internet connectivity, did not significantly influence the image quality or the turnaround time for reports.


Conclusion


The COVID-19 pandemic struck the globe unexpectedly, making social distancing and working from home the norm. Digital pathology is an excellent technology that is well integrated with our workflow. Along with a team approach, our experience shows that remote reporting and sign-out is noninferior to on-site reporting and is comparable to reporting WSIs on medical-grade monitors and light microscopy. Such studies on remote reporting open the door for the use of digital pathology for interinstitutional consultation and collaboration. Regulatory bodies have approved remote reporting and can further refine guidelines for validation and user acceptability.

Financial support and sponsorship

Nil.

Conflicts of interest

There are no conflicts of interest.



 
References

1. Pantanowitz L, Parwani A. Digital Pathology. Chicago: American Society for Clinical Pathology Press; 2017.
2. US Food and Drug Administration. FDA News Release: FDA Allows Marketing of First Whole Slide Imaging System for Digital Pathology; April 12, 2017. Available from: https://www.fda.gov/news-events/press-announcements/fda-allows-marketing-first-whole-slide-imaging-system-digital-pathology. [Last accessed on 2021 Feb 20].
3. Imaging Technology News. Leica Biosystems Receives FDA Clearance for Aperio AT2 DX Digital Pathology System; May 29, 2019. Available from: https://www.itnonline.com/content/leica-biosystems-receives-fda-clearance-aperio-at2-dx-digital-pathology-system; USFDA 510(k) clearance: https://www.accessdata.fda.gov/cdrh_docs/pdf19/K190332.pdf. [Last accessed on 2021 Feb 20].
4. Henwood AF. Coronavirus disinfection in histopathology. J Histotechnol 2020;43:1-3. [doi: 10.1080/01478885.2020.1734718].
5. College of American Pathologists. Individualized Quality Control Plan (IQCP): Is It Value-Added for Clinical Microbiology? Available from: https://www.cap.org/Laboratory-improvement/news-and-updates/cap-responds-to-your-covid-19-questions. [Last accessed on 2021 Feb 20].
6. Centers for Disease Control and Prevention. Information for Laboratories. Available from: https://www.cdc.gov/coronavirus/2019-nCOV/lab/. [Last accessed on 2021 Feb 20].
7. Centers for Disease Control and Prevention. Interim Laboratory Biosafety Guidelines for Handling and Processing Specimens Associated with Coronavirus Disease 2019 (COVID-19). Available from: https://www.cdc.gov/coronavirus/2019-nCoV/lab/lab-biosafety-guidelines.html. [Last accessed on 2021 Feb 20].
8. Wang W, Xu Y, Gao R, Lu R, Han K, Wu G, et al. Detection of SARS-CoV-2 in different types of clinical specimens. JAMA 2020;323:1843-4.
9. World Health Organization. Laboratory Biosafety Guidance Related to Coronavirus Disease 2019 (COVID-19): Interim Guidance; February 12, 2020. Available from: https://apps.who.int/iris/handle/10665/331138. [Last accessed on 2021 Feb 20].
10. Pantanowitz L, Sinard JH, Henricks WH, Fatheree LA, Carter AB, Contis L, et al. Validating whole slide imaging for diagnostic purposes in pathology: Guideline from the College of American Pathologists Pathology and Laboratory Quality Center. Arch Pathol Lab Med 2013;137:1710-22.
11. Williams BJ, Brettle D, Aslam M, Barrett P, Bryson G, Cross S, et al. Guidance for remote reporting of digital pathology slides during periods of exceptional service pressure: An emergency response from the UK Royal College of Pathologists. J Pathol Inform 2020;11:12.
12. Royal College of Pathologists. Best Practice Recommendations for Implementing Digital Pathology; 2018. Available from: https://www.rcpath.org/uploads/assets/f465d1b3-797b-4297b7fedc00b4d77e51/Best-practice-recommendations-for-implementing-digital-pathology.pdf. [Last accessed on 2021 Mar 19].
13. Mukhopadhyay S, Feldman MD, Abels E, Ashfaq R, Beltaifa S, Cacciabeve NG, et al. Whole slide imaging versus microscopy for primary diagnosis in surgical pathology: A multicenter blinded randomized noninferiority study of 1992 cases (pivotal study). Am J Surg Pathol 2018;42:39-52.
14. Borowsky AD, Glassy EF, Wallace WD, Kallichanda NS, Behling CA, et al. Digital whole slide imaging compared with light microscopy for primary diagnosis in surgical pathology: A multicenter, double-blinded, randomized study of 2045 cases. Arch Pathol Lab Med 2020;144:1245-53.
15. Lowe A, Chlipala E, Elin J, Kawano Y, Long R, Tillman D. Validation of Digital Pathology in a Healthcare Environment. San Diego: Digital Pathology Association; 2011.
16. Baidoshvili A, Bucur A, van Leeuwen J, van der Laak J, Kluin P, van Diest PJ. Evaluating the benefits of digital pathology implementation: Time savings in laboratory logistics. Histopathology 2018;73:784-94.
17. Buck TP, Dilorio R, Havrilla L, O'Neill DG. Validation of a whole slide imaging system for primary diagnosis in surgical pathology: A community hospital experience. J Pathol Inform 2014;5:43.
18. Campbell WS, Lele SM, West WW, Lazenby AJ, Smith LM, Hinrichs SH. Concordance between whole-slide imaging and light microscopy for routine surgical pathology. Hum Pathol 2012;43:1739-44.
19. Fraggetta F, Garozzo S, Zannoni GF, Pantanowitz L, Rossi ED. Routine digital pathology workflow: The Catania experience. J Pathol Inform 2017;8:51.
20. Retamero JA, Aneiros-Fernandez J, Del Moral RG. Complete digital pathology for routine histopathology diagnosis in a multicenter hospital network. Arch Pathol Lab Med 2019;144:221-8.
21. Williams BJ, Treanor D. Practical guide to training and validation for primary diagnosis with digital pathology. J Clin Pathol 2020;73:418-22.
22. Williams BJ, Hanby A, Millican-Slater R, Nijhawan A, Verghese E, Treanor D. Digital pathology for the primary diagnosis of breast histopathological specimens: An innovative validation and concordance study on digital pathology validation and training. Histopathology 2018;72:662-71.
23. Williams BJ, Ismail A, Chakrabarty A, Treanor D. Clinical digital neuropathology: Experience and observations from a departmental digital pathology training programme, validation and deployment. J Clin Pathol 2020. [doi: 10.1136/jclinpath-2019-206343]. Ahead of print.
24. Hanna MG, Reuter VE, Ardon O, Kim D, Sirintrapun SJ, Schüffler PJ, et al. Validation of a digital pathology system including remote review during the COVID-19 pandemic. Mod Pathol 2020;33:2115-27.
25. Vodovnik A. Distance reporting in digital pathology: A study on 950 cases. J Pathol Inform 2015;6:18.
26. Vodovnik A, Aghdam MR, Espedal DG. Remote autopsy services: A feasibility study on nine cases. J Telemed Telecare 2018;24:460-4.
27. Vodovnik A, Aghdam MR. Complete routine remote digital pathology services. J Pathol Inform 2018;9:36.
28. Evans AJ, Bauer TW, Bui MM, Cornish TC, Duncan H, Glassy EF, et al. US Food and Drug Administration approval of whole slide imaging for primary diagnosis: A key milestone is reached and new questions are raised. Arch Pathol Lab Med 2018;142:1383-7.
29. Lowe J, Trotter S. Royal College of Pathologists Guidelines on Working from Home. 3rd ed. London: The Royal College of Pathologists; 2014. Available from: https://www.rcpath.org/profession/guidelines/specialty-specific-publications.html. [Last accessed on 2018 Nov 19].
30. General Medical Council. Joint Statement: Supporting Doctors in the Event of a COVID-19 Epidemic in the UK. Available from: https://www.gmc-uk.org/news/news-archive/supporting-doctors-in-the-event-of-a-covid19-epidemic-in-the-uk. [Last accessed on 2021 Feb 21].
31. FindLaw.com. Court of Appeals of Texas, Houston (1st Dist.): Khoury v. Tomlinson, No. 01-16-00006-CV, March 30, 2017. Available from: https://caselaw.findlaw.com/tx-court-of-appeals/1854824.html. [Last accessed on 2021 Feb 20].
32. College of American Pathologists. COVID-19 Remote Sign-Out Guidance; 2020. Available from: https://documents.cap.org/documents/COVID19-Remote-Sign-Out-Guidance-vFNL.pdf. [Last accessed on 2021 Feb 20].
33. Clarke EL, Munnings C, Williams B, Brettle D, Treanor D. Display evaluation for primary diagnosis using digital pathology. J Med Imaging (Bellingham) 2020;7:027501.
34. Clarke EL, Brettle D, Sykes A, Wright A, Boden A, Treanor D. Development and evaluation of a novel point-of-use quality assurance tool for digital pathology. Arch Pathol Lab Med 2019;143:1246-55.
35. Wright AI, Clarke EL, Dunn CM, Williams BJ, Treanor DE, Brettle DS. A point-of-use quality assurance tool for digital pathology remote working. J Pathol Inform 2020;11:17.

