New pyramidal hybrid textural and deep features based automatic skin cancer classification model: Ensemble DarkNet and textural feature extractor

28 March 2022
M. Baygin
T. Tuncer
S. Dogan
Abstract

Background: Skin cancer is one of the most widely seen cancers worldwide, and automatic classification of skin cancer can benefit dermatology clinics by supporting accurate diagnosis. Hence, a machine learning-based automatic skin cancer detection model must be developed. Material and Method: This research addresses the automatic skin cancer detection problem. A colored skin cancer image dataset containing 3297 images in two classes is used, and an automatic model based on multilevel textural and deep features is presented. Multilevel fused features are generated using the discrete wavelet transform (DWT), local phase quantization (LPQ), local binary pattern (LBP), and the pre-trained DarkNet19 and DarkNet53 networks, and the top 1000 features are selected with threshold value-based neighborhood component analysis (NCA). The selected top 1000 features are classified using the 10-fold cross-validation technique. Results: With ten-fold cross-validation, the recommended pyramidal hybrid feature generator and NCA selector-based model achieves 91.54% classification accuracy. Furthermore, various training and testing separation ratios (90:10, 80:20, 70:30, 60:40, 50:50) are evaluated, and the maximum classification rate of 95.74% is obtained with the 90:10 separation ratio. Conclusions: The findings and calculated accuracies indicate that this model can be used in dermatology and pathology clinics to simplify the skin cancer detection process and help physicians.
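
The abstract describes a pipeline of fused feature generation, top-1000 feature selection, and 10-fold cross-validated classification. The following is a minimal Python sketch of the selection and classification stage only, under stated assumptions: the fused textural and deep feature matrix is replaced by random placeholder data, the threshold value-based NCA selector is approximated with a generic scikit-learn selector (SelectKBest), and the final classifier choice (SVM) is an assumption, since the abstract does not name one.

```python
# Hypothetical sketch of the top-1000 selection + 10-fold CV stage.
# Assumptions: placeholder feature matrix instead of real DWT/LPQ/LBP/DarkNet
# features; SelectKBest stands in for the paper's NCA-based selector; the SVM
# classifier is an illustrative choice, not the authors' reported classifier.
import numpy as np
from sklearn.feature_selection import SelectKBest, f_classif
from sklearn.model_selection import StratifiedKFold, cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

rng = np.random.default_rng(0)

# Placeholder for the fused feature matrix: 3297 images (as in the abstract),
# a few thousand concatenated textural + deep features, and binary labels.
X = rng.normal(size=(3297, 2000))
y = rng.integers(0, 2, size=3297)

pipeline = make_pipeline(
    StandardScaler(),
    # Stand-in for the threshold value-based NCA selector: keep the top 1000 features.
    SelectKBest(f_classif, k=1000),
    SVC(kernel="rbf"),
)

# 10-fold cross-validation, as reported in the abstract.
scores = cross_val_score(pipeline, X, y, cv=StratifiedKFold(n_splits=10))
print(f"mean CV accuracy: {scores.mean():.4f}")
```

On real features extracted as described in the paper, the selector and classifier would be fitted inside each fold exactly as above, so the reported accuracy reflects selection performed only on training data.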

View on arXiv