Generalization of prostate cancer classification for multiple sites using deep learning


Deep learning has the potential to substantially increase the accuracy and efficiency of prostate cancer diagnosis, which today is determined manually from H&E-stained specimens under a light microscope. In this paper, several approaches based on convolutional neural networks for prostate cancer classification are presented and compared, using three datasets of different origins. A key issue is highlighted: algorithms trained on data from one site may not generalize to other sites, owing for example to inevitable stain variations. Two techniques to overcome this complication are compared: training the networks with color augmentation, and using digital stain separation. Furthermore, the potential of using an autoencoder to obtain a more efficient downsampling is investigated; this turned out to be the method giving the best generalization. We achieve accuracies of 95% for classification of benign versus malignant tissue and 81% for Gleason grading on data from the same site as the training data. The corresponding accuracies for images from other sites are on average 88% and 52%, respectively.
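To illustrate the color-augmentation idea mentioned in the abstract, a minimal sketch in numpy is shown below. It applies a random per-channel gain and offset to an RGB patch, simulating stain variation between labs; the jitter ranges and function name are illustrative assumptions, not the paper's actual augmentation scheme.

```python
import numpy as np

def color_augment(patch, rng, max_gain=0.1, max_shift=0.05):
    """Apply a random per-channel color jitter to an RGB image patch.

    Each channel is multiplied by a random gain near 1 and shifted by a
    small random offset, crudely mimicking inter-site stain variation.
    Parameter ranges are illustrative, not taken from the paper.
    """
    gains = 1.0 + rng.uniform(-max_gain, max_gain, size=3)
    shifts = rng.uniform(-max_shift, max_shift, size=3)
    return np.clip(patch * gains + shifts, 0.0, 1.0)

# Example: augment a random 64x64 H&E-like patch with values in [0, 1].
rng = np.random.default_rng(seed=42)
patch = rng.random((64, 64, 3))
augmented = color_augment(patch, rng)
```

Training on many such jittered copies of each patch encourages the network to rely on morphology rather than absolute stain color, which is the motivation for this class of techniques.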

Original language: English
Title of host publication: 2018 IEEE 15th International Symposium on Biomedical Imaging, ISBI 2018
Publisher: IEEE Computer Society
Number of pages: 4
ISBN (Electronic): 9781538636367
Publication status: Published - 2018 May 23
Event: 15th IEEE International Symposium on Biomedical Imaging, ISBI 2018 - Washington, United States
Duration: 2018 Apr 4 – 2018 Apr 7


Conference: 15th IEEE International Symposium on Biomedical Imaging, ISBI 2018
Country/Territory: United States

Subject classification (UKÄ)

  • Radiology, Nuclear Medicine and Medical Imaging

Keywords
  • Autoencoder
  • Convolutional neural network
  • Digital stain separation
  • Gleason grade
  • Prostate cancer

