Toward Learning Joint Inference Tasks for IASS-MTS Using Dual Attention Memory With Stochastic Generative Imputation
Article
Wang, Zhen, Zhang, Yang, Pang, Yan, Wang, Nannan, Bah, Mohamed Jaward, Li, Ke and Zhang, Ji. 2023. "Toward Learning Joint Inference Tasks for IASS-MTS Using Dual Attention Memory With Stochastic Generative Imputation." IEEE Transactions on Neural Networks and Learning Systems. https://doi.org/10.1109/TNNLS.2023.3305542
Article Title | Toward Learning Joint Inference Tasks for IASS-MTS Using Dual Attention Memory With Stochastic Generative Imputation |
---|---|
ERA Journal ID | 4458 |
Article Category | Article |
Authors | Wang, Zhen, Zhang, Yang, Pang, Yan, Wang, Nannan, Bah, Mohamed Jaward, Li, Ke and Zhang, Ji |
Journal Title | IEEE Transactions on Neural Networks and Learning Systems |
Number of Pages | 15 |
Year | 2023 |
Publisher | IEEE (Institute of Electrical and Electronics Engineers) |
Place of Publication | United States |
ISSN | 1045-9227; 1941-0093; 2162-237X; 2162-2388 |
Digital Object Identifier (DOI) | https://doi.org/10.1109/TNNLS.2023.3305542 |
Web Address (URL) | https://ieeexplore.ieee.org/abstract/document/10236563 |
Abstract | Irregularly, asynchronously and sparsely sampled multivariate time series (IASS-MTS) are characterized by sparse and uneven time intervals and nonsynchronous sampling rates, posing significant challenges for machine learning models to learn complex relationships within and beyond IASS-MTS to support various inference tasks. Existing methods typically either focus solely on single-task forecasting or simply chain the two tasks, performing imputation as a separate preprocessing step before the downstream classification. However, these methods often ignore valuable annotated labels or fail to discover meaningful patterns from unlabeled data. Moreover, separate prefilling may introduce errors due to noise in the raw records and thus degrade downstream prediction performance. To overcome these challenges, we propose the time-aware dual attention and memory-augmented network (DAMA) with stochastic generative imputation (SGI). Our model constructs a joint task learning architecture that unifies imputation and classification tasks collaboratively. First, we design a new time-aware DAMA that accounts for irregular sampling rates, inherent data nonalignment, and sparse values in IASS-MTS data. The proposed network integrates both attention and memory to effectively analyze complex interactions within and across IASS-MTS for the classification task. Second, we develop the SGI network, which uses auxiliary information from sequence data to infer the missing observations in the time series. By balancing the joint tasks, our model facilitates interaction between them, leading to improved performance on both classification and imputation. Third, we evaluate our model on real-world datasets and demonstrate its superior imputation accuracy and classification results, outperforming the baselines. |
Keywords | Data models |
ANZSRC Field of Research 2020 | 460299. Artificial intelligence not elsewhere classified |
Public Notes | Files associated with this item cannot be displayed due to copyright restrictions. |
Byline Affiliations | Zhejiang Lab, China; Guangzhou University, China; Xidian University, China; Dalian Maritime University, China; School of Mathematics, Physics and Computing |
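The abstract describes a joint objective that couples the classification and imputation tasks rather than running imputation as a separate preprocessing step. As a rough illustration of that idea only (not the paper's actual formulation), a masked imputation loss over observed entries can be combined with a classification loss under a trade-off weight; the function names, the loss forms, and the weight `lam` below are all assumptions:

```python
import math

def masked_imputation_loss(x_true, x_hat, mask):
    """Mean squared error computed only over observed entries (mask == 1)."""
    num = sum(m * (t - p) ** 2 for t, p, m in zip(x_true, x_hat, mask))
    den = max(sum(mask), 1)  # avoid division by zero when nothing is observed
    return num / den

def cross_entropy(probs, label):
    """Negative log-likelihood of the true class under predicted probabilities."""
    return -math.log(max(probs[label], 1e-12))

def joint_loss(x_true, x_hat, mask, probs, label, lam=0.5):
    """Joint objective: classification loss plus lam times the imputation loss."""
    return cross_entropy(probs, label) + lam * masked_imputation_loss(x_true, x_hat, mask)
```

Because the imputation term is masked, only genuinely observed values supervise the reconstruction, while the classification term supplies label information; balancing the two (here via `lam`) is what lets the tasks inform each other.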
Permalink | https://research.usq.edu.au/item/z271q/toward-learning-joint-inference-tasks-for-iass-mts-using-dual-attention-memory-with-stochastic-generative-imputation |