20240520 Paper Report: Crossformer: Transformer Utilizing Cross-Dimension Dependency for Multivariate Time Series Forecasting

Talk Title: Crossformer: Transformer Utilizing Cross-Dimension Dependency for Multivariate Time Series Forecasting

Published in: ICLR 2023

Authors: Yunhao Zhang & Junchi Yan

Affiliation: MoE Key Lab of Artificial Intelligence, Shanghai Jiao Tong University and Shanghai AI Lab

Presenter: 康汝兵

Time: May 20, 2024

Venue: Room 621, Boxue Building (博学楼)

Abstract: Recently, many deep models have been proposed for multivariate time series (MTS) forecasting. In particular, Transformer-based models have shown great potential because they can capture long-term dependency. However, existing Transformer-based models mainly focus on modeling the temporal dependency (cross-time dependency) yet often omit the dependency among different variables (cross-dimension dependency), which is critical for MTS forecasting. To fill the gap, we propose Crossformer, a Transformer-based model utilizing cross-dimension dependency for MTS forecasting. In Crossformer, the input MTS is embedded into a 2D vector array through the Dimension-Segment-Wise (DSW) embedding to preserve time and dimension information. Then the Two-Stage Attention (TSA) layer is proposed to efficiently capture the cross-time and cross-dimension dependency. Utilizing the DSW embedding and TSA layer, Crossformer establishes a Hierarchical Encoder-Decoder (HED) to use the information at different scales for the final forecasting. Extensive experimental results on six real-world datasets show the effectiveness of Crossformer against previous state-of-the-art models.
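The DSW embedding described in the abstract can be illustrated with a minimal numpy sketch: each of the D dimensions of the input series is split into segments of length `seg_len`, and each segment is linearly projected to a `d_model` vector, producing the 2D array of segment embeddings that the TSA layer then attends over. The function name, random-projection initialization, and omission of positional embeddings are illustrative assumptions, not the paper's exact implementation.

```python
import numpy as np

def dsw_embedding(x, seg_len, d_model, seed=0):
    """Sketch of Dimension-Segment-Wise (DSW) embedding.

    x: (T, D) multivariate time series.
    Returns a (D, T // seg_len, d_model) array: one embedding vector
    per (dimension, segment) pair, preserving both time and dimension
    structure. Positional embeddings are omitted for brevity.
    """
    rng = np.random.default_rng(seed)
    T, D = x.shape
    assert T % seg_len == 0, "series length must be divisible by seg_len"
    # Shared linear projection applied to every segment (hypothetical init).
    W = rng.standard_normal((seg_len, d_model)) / np.sqrt(seg_len)
    # (T, D) -> (D, T // seg_len, seg_len): group each dimension into segments.
    segs = x.T.reshape(D, T // seg_len, seg_len)
    return segs @ W  # (D, T // seg_len, d_model)

# Example: 12 time steps, 2 dimensions, segments of length 4.
emb = dsw_embedding(np.arange(24.0).reshape(12, 2), seg_len=4, d_model=8)
print(emb.shape)  # (2, 3, 8)
```

The key design point is that, unlike point-wise embeddings in earlier Transformer forecasters, each token here corresponds to a segment of a single dimension, so attention can later be applied separately across time and across dimensions.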

