Expert Session: EXP-6: How versatile are self-supervised models?

Hung-yi Lee, National Taiwan University, Taiwan

  • SPS Members: Free
  • IEEE Members: $11.00
  • Non-members: $15.00

Length: 00:56:48
26 May 2022

Self-supervised learning (SSL) has proven vital to advancing research in natural language processing (NLP), computer vision (CV), and speech processing. The paradigm pre-trains a shared model on large volumes of unlabeled data and achieves state-of-the-art performance on a wide range of tasks with minimal adaptation. This talk will share some interesting findings about SSL models. For example, why do SSL models like BERT perform so well on NLP tasks? BERT is generally considered powerful in NLP because it can learn the semantics of words from large amounts of text data. Is this really the case? This talk will showcase recent findings on the interdisciplinary capabilities of SSL models that may change the way you think about them. This talk has little overlap with the ICASSP 2022 tutorial "Self-supervised Representation Learning for Speech Processing".
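To make the pre-train/fine-tune paradigm concrete, here is a minimal sketch using the Hugging Face Transformers library. The model name (bert-base-uncased), the toy sentiment-style examples, and the hyperparameters are illustrative assumptions, not details from the talk.

```python
# Minimal sketch of the SSL paradigm: load a model pre-trained with
# self-supervision (masked language modeling) on unlabeled text, then
# adapt it to a downstream task with a small amount of labeled data.
import torch
from transformers import AutoModelForSequenceClassification, AutoTokenizer

# bert-base-uncased is an assumed example checkpoint; any BERT-style
# SSL model would illustrate the same workflow.
tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
model = AutoModelForSequenceClassification.from_pretrained(
    "bert-base-uncased", num_labels=2
)

# Toy labeled examples standing in for a downstream dataset.
texts = ["a delightful film", "a tedious mess"]
labels = torch.tensor([1, 0])

batch = tokenizer(texts, padding=True, return_tensors="pt")
optimizer = torch.optim.AdamW(model.parameters(), lr=2e-5)

# One supervised fine-tuning step: only this adaptation is task-specific;
# the bulk of the model's knowledge comes from unlabeled pre-training.
model.train()
loss = model(**batch, labels=labels).loss
loss.backward()
optimizer.step()
print(f"fine-tuning loss: {loss.item():.4f}")
```

The key design point this sketch illustrates is that the same pre-trained encoder is shared across tasks; only a small task head and a brief fine-tuning pass change per task.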
