Increasing biases can be more efficient than increasing weights
Metta, Carlo; Amato, Gianluca; Marchetti, Alessandro; Parton, Maurizio
2025-01-01
Abstract
We introduce a novel computational unit for neural networks that features multiple biases, challenging the traditional perceptron structure. This unit emphasizes preserving uncorrupted information as it is passed from one unit to the next, applying the activation function later in the process with a specialized bias for each receiving unit. Through both empirical and theoretical analyses, we show that increasing the number of biases rather than the number of weights can significantly improve a neural network model's performance. This approach offers an alternative perspective on optimizing information flow within neural networks. Source code is available at https://github.com/CuriosAI/dac-dev.
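The abstract's description of the unit (raw pre-activations forwarded unchanged, with the activation applied on the receiving side using a bias specialized to each receiving unit) can be made concrete with a minimal sketch. The module name `MultiBiasLinear`, the ReLU choice, and the per-(input, output) bias layout below are illustrative assumptions based only on the abstract, not the paper's exact formulation; the authors' implementation is in the linked repository.

```python
# Hypothetical sketch of a "multi-bias" layer, inferred from the abstract:
# each unit forwards its raw pre-activation z, and the activation is applied
# on the receiving side with a bias specific to each (input, output) pair.
# The exact formulation may differ from the paper's; see
# https://github.com/CuriosAI/dac-dev for the authors' implementation.
import torch
import torch.nn as nn


class MultiBiasLinear(nn.Module):
    """y_j = sum_i w_ji * relu(z_i + b_ji), with one bias per (i, j) pair."""

    def __init__(self, in_features: int, out_features: int):
        super().__init__()
        self.weight = nn.Parameter(torch.randn(out_features, in_features) * 0.01)
        # A specialized bias for every input/output pair instead of a single
        # bias per output unit: (out_features, in_features) bias parameters.
        self.bias = nn.Parameter(torch.zeros(out_features, in_features))

    def forward(self, z: torch.Tensor) -> torch.Tensor:
        # z: (batch, in_features) -- the *uncorrupted* pre-activations of the
        # previous layer; no activation has been applied to them yet.
        shifted = z.unsqueeze(1) + self.bias          # (batch, out, in)
        activated = torch.relu(shifted)               # activation applied "late"
        return (self.weight * activated).sum(dim=-1)  # (batch, out) pre-activations


# Usage: stack layers by passing raw pre-activations from one layer to the next.
layer1 = MultiBiasLinear(8, 16)
layer2 = MultiBiasLinear(16, 4)
x = torch.randn(32, 8)
out = layer2(layer1(x))  # shape: (32, 4)
```

In this reading, the "multiple biases" of a unit are the biases attached to its outgoing signal, one for each downstream unit that consumes it; the activation is deferred until that consuming unit applies its own bias.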
| File | Description | Type | Size | Format | Access |
|---|---|---|---|---|---|
| s11634-025-00649-2.pdf | Article | Publisher's PDF | 1.96 MB | Adobe PDF | Archive administrators only (request a copy) |
Documents in IRIS are protected by copyright and all rights are reserved, unless otherwise indicated.


