Automatic diagnosis of melanoma using hyperspectral data and GoogLeNet

Research output: Contribution to journal › Article

Abstract

Background: Melanoma is a type of superficial tumor. Because advanced melanoma has a poor prognosis, early detection and treatment are essential to reduce melanoma-related deaths. To that end, there is a need for a quantitative method of diagnosing melanoma. This paper reports the development of such a diagnostic system using hyperspectral data (HSD) and a convolutional neural network (CNN), a type of machine learning.

Materials and Methods: HSD were acquired with a hyperspectral imager, a type of spectrometer that captures spectral and spatial information simultaneously. GoogLeNet pre-trained on ImageNet was used as the CNN model. Because many CNNs, including GoogLeNet, accept only three input channels, the 84-channel HSD could not be input directly. A "Mini Network" layer was therefore added just before the GoogLeNet input layer to reduce the number of channels from 84 to 3. In total, 619 lesions (278 melanoma and 341 non-melanoma) were used to train and evaluate the network.

Results and Conclusion: The system was evaluated by 5-fold cross-validation. Sensitivity, specificity, and accuracy were 69.1%, 75.7%, and 72.7% without data augmentation, and 72.3%, 81.2%, and 77.2% with data augmentation, respectively. In future work, we intend to improve the Mini Network and increase the number of lesions.
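The abstract does not specify the internal architecture of the "Mini Network" channel-reduction layer. One common way to map 84 spectral channels onto the 3 input channels a pre-trained GoogLeNet expects is a learned 1×1 convolution, which is equivalent to applying the same linear projection at every pixel. The sketch below illustrates that idea in NumPy; the function name `reduce_channels`, the image size, and the 1×1-convolution assumption are illustrative, not taken from the paper.

```python
import numpy as np

def reduce_channels(hsd, weights, bias):
    """Project an (H, W, 84) hyperspectral cube to (H, W, 3).

    Equivalent to a 1x1 convolution: the same linear map
    (weights: (84, 3), bias: (3,)) is applied at every pixel,
    producing a 3-channel image GoogLeNet can accept.
    """
    return hsd @ weights + bias

rng = np.random.default_rng(0)
hsd = rng.random((224, 224, 84))        # one lesion, 84 spectral channels
w = rng.standard_normal((84, 3)) * 0.1  # learned projection (random here)
b = np.zeros(3)

rgb_like = reduce_channels(hsd, w, b)
print(rgb_like.shape)  # (224, 224, 3)
```

In a real training pipeline the projection weights would be learned jointly with the rest of the network, so the layer can select the spectral bands most informative for distinguishing melanoma from non-melanoma lesions.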

Details

Authors
Organisations
External organisations
  • Shinshu University Hospital
  • National Cancer Center Hospital, Japan
  • Waseda University
  • Kindai University
  • Shizuoka Cancer Center
Research areas and keywords

Subject classification (UKÄ)

  • Cancer and Oncology
  • Dermatology and Venereal Diseases

Keywords

  • deep learning
  • GoogLeNet
  • hyperspectral imager
  • melanoma
Original language: English
Journal: Skin Research and Technology
Publication status: E-pub ahead of print - 2020 Jun 25
Publication category: Research
Peer-reviewed: Yes