Cross-Attention for Time-Series Transformers

In the context of multivariate time-series analysis, develop a novel form of cross-attention mechanism for transformers. The goal is to enhance the model's ability to learn distinct attention patterns between the variables targeted for prediction and the supplementary features.

Requirements

  • M.Sc. in Machine Learning, Computer Science, Mathematics, Physics, or similar
  • Basic knowledge of time series
  • Decent familiarity with the transformer architecture and the self-attention mechanism
  • Familiarity with Python and PyTorch

Description

Transformers have seen widespread success in various sequence-based tasks, yet their performance on time-series analysis often falls short of their success in other problem domains. In multivariate time series, incorporating additional temporal features alongside the target variables is crucial for accurate predictions. Conventionally, when using transformers, these supplementary features are simply concatenated with the target variable sequence, treating all features equally. However, this approach is suboptimal: it fails to differentiate between the two types of features and neglects any potential attentional interplay between them.
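
As a concrete reference, the minimal PyTorch sketch below illustrates this conventional baseline. The shapes, layer sizes, and names (targets, extras, d_model) are illustrative assumptions, not a prescribed setup.

```python
import torch
import torch.nn as nn

# Illustrative shapes (assumptions): batch B, sequence length T,
# n_targets variables to forecast, n_extra supplementary features.
B, T, n_targets, n_extra, d_model = 32, 96, 3, 5, 64

targets = torch.randn(B, T, n_targets)  # variables to predict
extras = torch.randn(B, T, n_extra)     # supplementary temporal features

# Conventional baseline: concatenate along the channel axis and project
# into one shared embedding, so self-attention treats all features equally.
x = torch.cat([targets, extras], dim=-1)  # (B, T, n_targets + n_extra)
embed = nn.Linear(n_targets + n_extra, d_model)
encoder = nn.TransformerEncoder(
    nn.TransformerEncoderLayer(d_model, nhead=4, batch_first=True),
    num_layers=2,
)
h = encoder(embed(x))  # (B, T, d_model)
```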
The thesis aims to address this limitation by introducing a novel cross-attention mechanism tailored specifically to multivariate time-series analysis. Unlike conventional methods, this mechanism treats the variables to be predicted and the extra features as separate streams, employing a distinct form of attention, namely cross-attention, to relate one to the other. By discerning and prioritizing the relevance of different features, the proposed approach seeks to enhance the transformer's ability to capture intricate temporal dependencies and improve predictive performance in multivariate time-series forecasting.
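
By contrast, the following sketch shows one way the separate-stream idea could look in PyTorch. It is a hypothetical illustration of the direction, not the thesis deliverable, and all module and parameter names are assumptions.

```python
import torch
import torch.nn as nn

class TargetFeatureCrossAttention(nn.Module):
    """Hypothetical sketch: target variables and extra features get
    separate embeddings, and target tokens attend to the extra-feature
    tokens through a dedicated cross-attention layer."""

    def __init__(self, n_targets, n_extra, d_model=64, nhead=4):
        super().__init__()
        self.embed_targets = nn.Linear(n_targets, d_model)
        self.embed_extras = nn.Linear(n_extra, d_model)
        self.self_attn = nn.MultiheadAttention(d_model, nhead, batch_first=True)
        self.cross_attn = nn.MultiheadAttention(d_model, nhead, batch_first=True)
        self.norm1 = nn.LayerNorm(d_model)
        self.norm2 = nn.LayerNorm(d_model)

    def forward(self, targets, extras):
        # targets: (B, T, n_targets); extras: (B, T, n_extra)
        q = self.embed_targets(targets)
        kv = self.embed_extras(extras)
        # Self-attention over the target stream alone.
        h, _ = self.self_attn(q, q, q)
        q = self.norm1(q + h)
        # Cross-attention: queries from targets, keys/values from extras,
        # so the attention weights express how much each extra feature
        # matters at each target time step.
        h, _ = self.cross_attn(q, kv, kv)
        return self.norm2(q + h)  # (B, T, d_model)

# Example usage with random data:
block = TargetFeatureCrossAttention(n_targets=3, n_extra=5)
out = block(torch.randn(32, 96, 3), torch.randn(32, 96, 5))
```

Here queries come from the targets, so the attention weights directly express how much each supplementary feature contributes at each prediction step; making the attention symmetric, or stacking several such blocks, are natural variations to explore.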
The thesis comprises three steps: (1) becoming familiar with attention and cross-attention; (2) reviewing the literature on transformers for time series; and (3) developing a novel cross-attention mechanism between the target variables and the extra temporal features.
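
For step (1), the common ground is standard scaled dot-product attention: in self-attention the queries Q, keys K, and values V are all projections of the same sequence, whereas in cross-attention Q comes from one sequence (here, the target variables) and K, V from another (the extra features):

  Attention(Q, K, V) = softmax(QKᵀ / √d_k) V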

Contacts