Commit e323b92

added accepted paper on GSL and GDL for TS

1 parent 00cfc92

File tree

2 files changed: +14 −11 lines changed


_data/publications.yaml

Lines changed: 14 additions & 11 deletions
@@ -29,8 +29,8 @@
 - title: 'Learning Latent Graph Structures and their Uncertainty'
   links:
     paper: https://arxiv.org/abs/2405.19933
-  venue: preprint
-  year: 2024
+  venue: To appear in International Conference on Machine Learning
+  year: 2025
   authors:
     - id:amanenti
     - id:dzambon
@@ -41,15 +41,11 @@
     - model calibration
   abstract: Within a prediction task, Graph Neural Networks (GNNs) use relational information as an inductive bias to enhance the model's accuracy. As task-relevant relations might be unknown, graph structure learning approaches have been proposed to learn them while solving the downstream prediction task. In this paper, we demonstrate that minimization of a point-prediction loss function, e.g., the mean absolute error, does not guarantee proper learning of the latent relational information and its associated uncertainty. Conversely, we prove that a suitable loss function on the stochastic model outputs simultaneously grants (i) the unknown adjacency matrix latent distribution and (ii) optimal performance on the prediction task. Finally, we propose a sampling-based method that solves this joint learning task. Empirical results validate our theoretical claims and demonstrate the effectiveness of the proposed approach.
   bibtex: >
-    @misc{manenti2024learning,
+    @inproceedings{manenti2025learning,
       title = {Learning {{Latent Graph Structures}} and Their {{Uncertainty}}},
       author = {Manenti, Alessandro and Zambon, Daniele and Alippi, Cesare},
-      year = {2024},
-      month = may,
-      number = {arXiv:2405.19933},
-      primaryclass = {cs, stat},
-      publisher = {arXiv},
-      archiveprefix = {arxiv}
+      year = {2025},
+      booktitle = {To appear in Proceedings of the Forty-Second International Conference on Machine Learning (ICML)},
     }
 - title: 'Temporal Graph ODEs for Irregularly-Sampled Time Series'
   links:
@@ -120,8 +116,8 @@
 - title: Graph Deep Learning for Time Series Forecasting
   links:
     paper: https://arxiv.org/abs/2310.15978
-  venue: Preprint
-  year: 2023
+  venue: To appear in ACM Computing Surveys
+  year: 2025
   authors:
     - id:acini
     - id:imarisca
@@ -131,6 +127,13 @@
     - spatiotemporal graphs
     - forecasting
   abstract: Graph-based deep learning methods have become popular tools to process collections of correlated time series. Differently from traditional multivariate forecasting methods, neural graph-based predictors take advantage of pairwise relationships by conditioning forecasts on a (possibly dynamic) graph spanning the time series collection. The conditioning can take the form of an architectural inductive bias on the neural forecasting architecture, resulting in a family of deep learning models called spatiotemporal graph neural networks. Such relational inductive biases enable the training of global forecasting models on large time-series collections, while at the same time localizing predictions w.r.t. each element in the set (i.e., graph nodes) by accounting for local correlations among them (i.e., graph edges). Indeed, recent theoretical and practical advances in graph neural networks and deep learning for time series forecasting make the adoption of such processing frameworks appealing and timely. However, most of the studies in the literature focus on proposing variations of existing neural architectures by taking advantage of modern deep learning practices, while foundational and methodological aspects have not been subject to systematic investigation. To fill the gap, this paper aims to introduce a comprehensive methodological framework that formalizes the forecasting problem and provides design principles for graph-based predictive models and methods to assess their performance. At the same time, together with an overview of the field, we provide design guidelines, recommendations, and best practices, as well as an in-depth discussion of open challenges and future research directions.
+  bibtex: >
+    @article{cini2025graph,
+      title = {Graph {{Deep Learning}} for {{Time Series Forecasting}}},
+      author = {Cini, Andrea and Marisca, Ivan and Zambon, Daniele and Alippi, Cesare},
+      year = {2025},
+      journal = {To appear in ACM Computing Surveys},
+    }
 - title: Graph Representation Learning (special session at ESANN 2023)
   links:
     paper: https://doi.org/10.14428/esann/2023.ES2023-4

assets/img/people/DZ.jpg

-25.6 KB

0 commit comments
