
JGPR: a computationally efficient multi-target Gaussian process regression algorithm

Shahbazian R.
2022-05-11

Abstract

Multi-target regression algorithms are designed to predict multiple outputs at the same time, allowing all output variables to be taken into account during the training phase. Despite recent advances, developing a low-cost and highly accurate algorithm in this area of machine learning remains an open challenge. The main difficulty in multi-target regression is how to use the different targets' information in the training and/or test phases. In this paper, we introduce a low-cost multi-target Gaussian process regression (GPR) algorithm, called joint GPR (JGPR), which employs a shared covariance matrix among the targets during the training phase and solves a sub-optimal cost function to optimize the hyperparameters. The proposed strategy considerably reduces the computational complexity of the training and test phases and simultaneously avoids overfitting of the multi-target regression algorithm to the targets. We have performed extensive experiments on both simulated data and 18 benchmark datasets to compare the proposed method with other multi-target regression algorithms. Experimental results show that the proposed JGPR outperforms state-of-the-art approaches on most of the benchmark datasets.
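The computational saving described above comes from sharing one covariance matrix across all targets, so a single matrix factorization serves every output dimension. The following minimal sketch (our own illustration, not the authors' code; kernel choice, hyperparameter values, and function names are assumptions) shows the idea for standard GPR prediction with a shared kernel:

```python
import numpy as np

def rbf_kernel(A, B, length_scale=1.0, variance=1.0):
    """Squared-exponential covariance between the rows of A and B."""
    d2 = np.sum(A**2, 1)[:, None] + np.sum(B**2, 1)[None, :] - 2 * A @ B.T
    return variance * np.exp(-0.5 * d2 / length_scale**2)

def shared_cov_gpr_predict(X, Y, X_new, noise=1e-2, length_scale=1.0):
    """Jointly predict all targets; Y has shape (n, T).

    One Cholesky factorization of the shared covariance matrix is reused
    for every target column, instead of one factorization per target.
    """
    K = rbf_kernel(X, X, length_scale) + noise * np.eye(len(X))
    L = np.linalg.cholesky(K)                            # single O(n^3) step
    alpha = np.linalg.solve(L.T, np.linalg.solve(L, Y))  # K^{-1} Y, all targets
    K_s = rbf_kernel(X_new, X, length_scale)
    return K_s @ alpha                                   # (m, T) predictions

rng = np.random.default_rng(0)
X = rng.uniform(-3, 3, (50, 1))
Y = np.hstack([np.sin(X), np.cos(X)])  # two correlated targets
pred = shared_cov_gpr_predict(X, Y, np.array([[0.0]]))
```

With independent per-target GPRs, the factorization cost scales with the number of targets; with a shared covariance matrix it is paid once, which is the source of the complexity reduction the abstract refers to.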
Nabati, M., Ghorashi, S.A., Shahbazian, R. (2022). JGPR: a computationally efficient multi-target Gaussian process regression algorithm. MACHINE LEARNING, 111(6), 1987-2010 [10.1007/s10994-022-06170-3].
Files in this product:

File: s10994-022-06170-3.pdf (open access)
Type: Editorial Version
Format: Adobe PDF
Size: 1.97 MB

Documents in IRIS are protected by copyright and all rights are reserved, unless otherwise indicated.

Use this identifier to cite or link to this document: https://hdl.handle.net/10447/696247
Citations
  • Scopus: 14