On Mean Absolute Error for Deep Neural Network Based Vector-to-Vector Regression

Sabato Marco Siniscalchi (Investigation)
2020-01-01

Abstract

In this paper, we exploit the properties of mean absolute error (MAE) as a loss function for deep neural network (DNN) based vector-to-vector regression. The goal of this work is two-fold: (i) presenting performance bounds of MAE, and (ii) demonstrating new properties of MAE that make it more appropriate than mean squared error (MSE) as a loss function for DNN-based vector-to-vector regression. First, we show that a generalized upper bound for DNN-based vector-to-vector regression can be ensured by leveraging the known Lipschitz continuity property of MAE. Next, we derive a new generalized upper bound in the presence of additive noise. Finally, in contrast to conventional MSE, commonly adopted to approximate Gaussian errors for regression, we show that MAE can be interpreted as an error modeled by a Laplacian distribution. Speech enhancement experiments are conducted to corroborate our proposed theorems and validate the performance advantages of MAE over MSE for DNN-based regression.
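As a brief illustration of the error-model view mentioned in the abstract, the following sketch recalls the standard maximum-likelihood correspondence between MSE/Gaussian errors and MAE/Laplacian errors; it is not taken from the paper's derivation, and the regression function $f(\cdot;\mathbf{w})$, scale parameters $\sigma$ and $b$, and sample count $N$ are notation introduced here for illustration only:

\[
p_{\text{Gauss}}(\mathbf{y}\mid\mathbf{x}) \propto \exp\!\Big(-\tfrac{1}{2\sigma^{2}}\,\lVert \mathbf{y}-f(\mathbf{x};\mathbf{w})\rVert_2^2\Big)
\;\Longrightarrow\;
\hat{\mathbf{w}} = \arg\min_{\mathbf{w}} \frac{1}{N}\sum_{n=1}^{N}\lVert \mathbf{y}_n - f(\mathbf{x}_n;\mathbf{w})\rVert_2^2 \quad (\text{MSE}),
\]
\[
p_{\text{Lap}}(\mathbf{y}\mid\mathbf{x}) \propto \exp\!\Big(-\tfrac{1}{b}\,\lVert \mathbf{y}-f(\mathbf{x};\mathbf{w})\rVert_1\Big)
\;\Longrightarrow\;
\hat{\mathbf{w}} = \arg\min_{\mathbf{w}} \frac{1}{N}\sum_{n=1}^{N}\lVert \mathbf{y}_n - f(\mathbf{x}_n;\mathbf{w})\rVert_1 \quad (\text{MAE}).
\]

In both cases, maximizing the log-likelihood over the training pairs $(\mathbf{x}_n, \mathbf{y}_n)$ is equivalent to minimizing the corresponding loss, which is the sense in which MAE models Laplacian errors while MSE models Gaussian ones.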
2020
Settore ING-INF/05 - Information Processing Systems
Jun Qi, Jun Du, Sabato Marco Siniscalchi, Xiaoli Ma, Chin-Hui Lee (2020). On Mean Absolute Error for Deep Neural Network Based Vector-to-Vector Regression. IEEE Signal Processing Letters, 27, 1485-1489. doi:10.1109/lsp.2020.3016837
Files in this record:

File: On_Mean_Absolute_Error_for_Deep_Neural_Network_Based_Vector-to-Vector_Regression.pdf
Type: Publisher's version
Access: Restricted (repository managers only; copy available on request)
Size: 283.98 kB
Format: Adobe PDF

File: 2008.07281v1.pdf
Type: Pre-print
Access: Open access
Size: 138.71 kB
Format: Adobe PDF
Documents in IRIS are protected by copyright and all rights are reserved, unless otherwise indicated.

Use this identifier to cite or link to this document: https://hdl.handle.net/10447/636625
Citations
  • Scopus: 167
  • Web of Science (ISI): 127