
Mapping and Compressing a Convolutional Neural Network through a Multilayer Network

Amelio, A. (first author)
2022-01-01

Abstract

This paper addresses the interpretability of the internal structure of deep learning architectures. In particular, we propose an approach for mapping a Convolutional Neural Network (CNN) into a multilayer network. Then, to show how such a mapping helps to better understand a CNN, we propose a technique for compressing it. This technique detects whether there are convolutional layers that can be removed without excessively reducing performance and, if so, removes them. In this way, we obtain lighter and faster CNN models that can be easily employed in any scenario.
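As a rough illustration of the accuracy-guarded layer removal the abstract describes, the sketch below greedily drops layers whose removal keeps validation accuracy within a tolerance of the full model. This is a generic, hypothetical sketch (the `evaluate` callback and the `tolerance` parameter are assumptions introduced here), not the authors' actual algorithm, which relies on the multilayer-network mapping of the CNN.

```python
# Hypothetical sketch of accuracy-guarded layer pruning; NOT the paper's algorithm.
def prune_layers(layers, evaluate, tolerance=0.01):
    """Greedily drop layers whose removal costs at most `tolerance` accuracy.

    layers    : list of layer identifiers (e.g. names of convolutional layers)
    evaluate  : callable mapping a list of kept layers to validation accuracy
    tolerance : maximum allowed accuracy drop with respect to the full model
    """
    baseline = evaluate(layers)          # accuracy of the uncompressed model
    kept = list(layers)
    for layer in list(kept):             # try removing each layer in turn
        candidate = [l for l in kept if l != layer]
        if evaluate(candidate) >= baseline - tolerance:
            kept = candidate             # removal is safe: keep the smaller model
    return kept
```

For instance, with a toy `evaluate` in which only one layer actually matters, the loop discards every other layer and returns a much lighter model.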
Year: 2022
Series: CEUR Workshop Proceedings
Language: English
Conference: 30th Italian Symposium on Advanced Database Systems, SEBD 2022 (2022)
Venue: Grand Hotel Continental, Italy
Volume: 3194
Pages: 325-332 (8 pages)
Publisher: CEUR-WS
Keywords: Convolutional Layer Pruning; Convolutional Neural Networks; Deep Learning; Multilayer Networks
Authors: Amelio, A.; Bonifazi, G.; Corradini, E.; Marchetti, M.; Ursino, D.; Virgili, L.
Type: info:eu-repo/semantics/conferenceObject
Category: 4 Contribution in Conference Proceedings::4.1 Conference proceedings contribution
Files in this record:
There are no files associated with this record.

Documents in IRIS are protected by copyright and all rights are reserved, unless otherwise indicated.

Use this identifier to cite or link to this document: https://hdl.handle.net/11564/799711
Warning: the displayed data have not been validated by the university.

Citations
  • PMC: not available
  • Scopus: 0
  • Web of Science: not available