|
 - title: 'Learning Latent Graph Structures and their Uncertainty'
 links:
 paper: https://arxiv.org/abs/2405.19933
-venue: preprint
-year: 2024
+venue: To appear in International Conference on Machine Learning
+year: 2025
 authors:
 - id:amanenti
 - id:dzambon
|
 - model calibration
 abstract: Within a prediction task, Graph Neural Networks (GNNs) use relational information as an inductive bias to enhance the model's accuracy. As task-relevant relations might be unknown, graph structure learning approaches have been proposed to learn them while solving the downstream prediction task. In this paper, we demonstrate that minimization of a point-prediction loss function, e.g., the mean absolute error, does not guarantee proper learning of the latent relational information and its associated uncertainty. Conversely, we prove that a suitable loss function on the stochastic model outputs simultaneously grants (i) the unknown adjacency matrix latent distribution and (ii) optimal performance on the prediction task. Finally, we propose a sampling-based method that solves this joint learning task. Empirical results validate our theoretical claims and demonstrate the effectiveness of the proposed approach.
 bibtex: >
-@misc{manenti2024learning,
+@inproceedings{manenti2025learning,
 title = {Learning {{Latent Graph Structures}} and Their {{Uncertainty}}},
 author = {Manenti, Alessandro and Zambon, Daniele and Alippi, Cesare},
-year = {2024},
-month = may,
-number = {arXiv:2405.19933},
-primaryclass = {cs, stat},
-publisher = {arXiv},
-archiveprefix = {arxiv}
+year = {2025},
+booktitle = {Proceedings of the Forty-Second International Conference on Machine Learning (ICML)},
+note = {To appear}
 }
 - title: 'Temporal Graph ODEs for Irregularly-Sampled Time Series'
 links:
|
 - title: Graph Deep Learning for Time Series Forecasting
 links:
 paper: https://arxiv.org/abs/2310.15978
-venue: Preprint
-year: 2023
+venue: To appear in ACM Computing Surveys
+year: 2025
 authors:
 - id:acini
 - id:imarisca
|
 - spatiotemporal graphs
 - forecasting
 abstract: Graph-based deep learning methods have become popular tools to process collections of correlated time series. Differently from traditional multivariate forecasting methods, neural graph-based predictors take advantage of pairwise relationships by conditioning forecasts on a (possibly dynamic) graph spanning the time series collection. The conditioning can take the form of an architectural inductive bias on the neural forecasting architecture, resulting in a family of deep learning models called spatiotemporal graph neural networks. Such relational inductive biases enable the training of global forecasting models on large time-series collections, while at the same time localizing predictions w.r.t. each element in the set (i.e., graph nodes) by accounting for local correlations among them (i.e., graph edges). Indeed, recent theoretical and practical advances in graph neural networks and deep learning for time series forecasting make the adoption of such processing frameworks appealing and timely. However, most of the studies in the literature focus on proposing variations of existing neural architectures by taking advantage of modern deep learning practices, while foundational and methodological aspects have not been subject to systematic investigation. To fill the gap, this paper aims to introduce a comprehensive methodological framework that formalizes the forecasting problem and provides design principles for graph-based predictive models and methods to assess their performance. At the same time, together with an overview of the field, we provide design guidelines, recommendations, and best practices, as well as an in-depth discussion of open challenges and future research directions.
+bibtex: >
+@article{cini2025graph,
+title = {Graph {{Deep Learning}} for {{Time Series Forecasting}}},
+author = {Cini, Andrea and Marisca, Ivan and Zambon, Daniele and Alippi, Cesare},
+year = {2025},
+journal = {ACM Computing Surveys},
+note = {To appear}
+}
 - title: Graph Representation Learning (special session at ESANN 2023)
 links:
 paper: https://doi.org/10.14428/esann/2023.ES2023-4
|