LOW-COMPLEXITY ATTENTION MODELLING VIA GRAPH TENSOR NETWORKS

Yao Lei Xu, Kriton Konstantinidis, Shengxi Li, Danilo P. Mandic, Ljubiša Stanković

10 May 2022

The attention mechanism is at the core of modern Natural Language Processing (NLP) models, owing to its ability to focus on the most contextually relevant parts of a sequence. However, current attention models rely on "flat-view" matrix methods to process tokens embedded in vector spaces; this results in exceedingly high parameter complexity, which is prohibitive for practical applications. To this end, we introduce a novel Tensorized Graph Attention (TGA) mechanism, which leverages the recent Graph Tensor Network (GTN) framework to efficiently process tensorized token embeddings via attention-based graph filters. Such tensorized token embeddings are shown to effectively bypass the Curse of Dimensionality, reducing the parameter complexity of the attention mechanism from exponential to linear in the embedding dimensions. The expressive power of the TGA framework is further enhanced by virtue of domain-aware graph convolution filters. Simulations across benchmark NLP paradigms verify the advantages of the proposed framework over existing attention models, at drastically lower parameter complexity.
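To make the complexity claim concrete, the following is a minimal sketch (not the authors' code) of how tensorizing a weight matrix of size D x D, with D = d1·d2·d3, into a tensor-train (TT) factorization changes the parameter count from a product over the factor dimensions to a sum over them. The dimensions `dims = [8, 8, 8]` and `rank = 4` are assumed purely for illustration.

```python
import numpy as np

# Hypothetical illustration of the parameter savings from tensorization.
# A dense D x D weight (D = d1*d2*d3) scales as the square of the product
# of the factor dimensions, whereas a tensor-train (TT) matrix factorization
# with small rank scales as a sum over the factor dimensions (i.e. linearly
# in the number/size of the embedding modes).
dims = [8, 8, 8]   # assumed factor dimensions, so D = 512
rank = 4           # assumed TT-rank

# Dense weight: D^2 parameters.
dense_params = int(np.prod(dims)) ** 2

# TT-matrix cores: core i has shape (r_{i-1}, d_i, d_i, r_i),
# with boundary ranks r_0 = r_N = 1.
tt_params = sum(
    (rank if i > 0 else 1) * d * d * (rank if i < len(dims) - 1 else 1)
    for i, d in enumerate(dims)
)

print(f"dense parameters: {dense_params}, tensor-train parameters: {tt_params}")
# e.g. dense parameters: 262144, tensor-train parameters: 1536
```

This sketch only counts parameters; it does not reproduce the TGA attention or graph-filtering operations described in the paper.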
