
Self-supervised pre-training of CNNs for flatness defect classification in the steelworks industry

Galli F;
2020

Abstract

Classification of surface defects in the steelworks industry plays a significant role in guaranteeing product quality. From an industrial point of view, a serious concern is the shape defects of hot-rolled products, particularly those concerning strip flatness. Flatness defects are typically divided into four sub-classes depending on which part of the strip is affected and on the corresponding shape. The primary objective of this research is to evaluate the improvements brought by the self-supervised learning paradigm to defect classification, taking advantage of unlabelled, real steel strip flatness maps. Different pre-training methods are compared, as well as architectures built from well-established neural subnetworks such as Residual and Inception modules. A systematic evaluation of the different configurations provides a formal comparison of the self-supervised pre-training paradigms considered here. In particular, pre-training neural networks with the EgoMotion meta-algorithm yields classification improvements over the AutoEncoder technique, which in turn performs better than Glorot weight initialization.
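The sketch below illustrates, in general terms, the pre-training paradigm the abstract compares: a CNN encoder is first trained in a self-supervised way as an AutoEncoder on unlabelled flatness maps, then reused with a classification head for the four flatness defect sub-classes, against a baseline initialized with Glorot (Xavier) weights. It is a minimal illustration and not the authors' code; the architecture, the 1 x 64 x 64 input size, and all function names are assumptions made here for clarity.

    # Minimal sketch (not the authors' implementation): AutoEncoder pre-training
    # of a CNN encoder on unlabelled flatness maps, then fine-tuning a 4-class
    # classifier. Input size (1 x 64 x 64) and layer sizes are illustrative.
    import torch
    import torch.nn as nn

    class Encoder(nn.Module):
        def __init__(self):
            super().__init__()
            self.net = nn.Sequential(
                nn.Conv2d(1, 16, 3, stride=2, padding=1), nn.ReLU(),   # 64 -> 32
                nn.Conv2d(16, 32, 3, stride=2, padding=1), nn.ReLU(),  # 32 -> 16
                nn.Conv2d(32, 64, 3, stride=2, padding=1), nn.ReLU(),  # 16 -> 8
            )

        def forward(self, x):
            return self.net(x)

    class Decoder(nn.Module):
        def __init__(self):
            super().__init__()
            self.net = nn.Sequential(
                nn.ConvTranspose2d(64, 32, 2, stride=2), nn.ReLU(),    # 8 -> 16
                nn.ConvTranspose2d(32, 16, 2, stride=2), nn.ReLU(),    # 16 -> 32
                nn.ConvTranspose2d(16, 1, 2, stride=2),                # 32 -> 64
            )

        def forward(self, z):
            return self.net(z)

    def pretrain_autoencoder(encoder, unlabelled_batches, epochs=10):
        """Self-supervised pre-training: reconstruct unlabelled flatness maps."""
        decoder = Decoder()
        opt = torch.optim.Adam(
            list(encoder.parameters()) + list(decoder.parameters()), lr=1e-3)
        loss_fn = nn.MSELoss()
        for _ in range(epochs):
            for x in unlabelled_batches:       # x: (batch, 1, 64, 64)
                loss = loss_fn(decoder(encoder(x)), x)
                opt.zero_grad()
                loss.backward()
                opt.step()
        return encoder

    def build_classifier(encoder, n_classes=4):
        """Attach a head for the four flatness defect sub-classes."""
        return nn.Sequential(encoder, nn.Flatten(), nn.Linear(64 * 8 * 8, n_classes))

    def glorot_init(module):
        """Baseline: Glorot (Xavier) initialization instead of pre-training."""
        if isinstance(module, (nn.Conv2d, nn.Linear)):
            nn.init.xavier_uniform_(module.weight)
            nn.init.zeros_(module.bias)

    if __name__ == "__main__":
        unlabelled = [torch.rand(8, 1, 64, 64) for _ in range(4)]   # dummy batches
        clf_ae = build_classifier(pretrain_autoencoder(Encoder(), unlabelled, epochs=1))
        clf_glorot = build_classifier(Encoder().apply(glorot_init))
        print(clf_ae(torch.rand(2, 1, 64, 64)).shape)               # torch.Size([2, 4])

The EgoMotion meta-algorithm mentioned in the abstract would replace the reconstruction objective above with a different self-supervised pretext task before the same fine-tuning step; its details are not reproduced here.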
Settore ING-INF/05 - Information Processing Systems
Self-supervision; Steelworks; Deep learning; CNN
Files in this record:
File: Self_supervised_pre_training_of_CNNs_for.pdf
Access: Open access
Type: Published version
License: Creative Commons
Size: 661.45 kB
Format: Adobe PDF
Documents in IRIS are protected by copyright and all rights are reserved, unless otherwise indicated.

Use this identifier to cite or link to this document: https://hdl.handle.net/11384/142465
Citations
  • Scopus 2