11 Jun 2021

We propose enhancing Transformer language models (BERT, RoBERTa) to take advantage of pauses, which play an important role in speech. In previous work, we developed a method to encode pauses in transcripts for the recognition of Alzheimer’s disease. In this study, we extend this idea to language models: we re-train BERT and RoBERTa on a large collection of pause-encoded transcripts and fine-tune them for two downstream tasks, recognition of Alzheimer’s disease and recognition of emotion. Pause-encoded language models outperform text-only language models on both tasks, and pause augmentation by duration perturbation during training further improves them.
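For concreteness, here is a minimal sketch of what pause encoding and duration-perturbation augmentation might look like, using Hugging Face transformers. The pause bins, token names, minimum-gap threshold, and multiplicative jitter are illustrative assumptions; the abstract does not specify these details.

```python
import random
from transformers import AutoTokenizer, AutoModelForMaskedLM

# Hypothetical duration bins (seconds); the paper's actual binning
# scheme is not given on this page.
PAUSE_BINS = [(0.3, "<p_short>"), (1.0, "<p_med>")]
LONG_PAUSE = "<p_long>"
MIN_PAUSE = 0.05  # ignore gaps shorter than this (assumed threshold)

def pause_token(duration: float) -> str:
    """Map a pause duration to a discrete pause token."""
    for upper, token in PAUSE_BINS:
        if duration < upper:
            return token
    return LONG_PAUSE

def encode_transcript(words, pauses):
    """Interleave words with pause tokens; pauses[i] is the silence
    (in seconds) following words[i]."""
    out = []
    for word, gap in zip(words, pauses):
        out.append(word)
        if gap >= MIN_PAUSE:
            out.append(pause_token(gap))
    return " ".join(out)

def perturb_pauses(pauses, scale=0.2):
    """Pause augmentation by duration perturbation (assumed form):
    multiplicative jitter so one transcript yields varied pause tokens."""
    return [p * random.uniform(1.0 - scale, 1.0 + scale) for p in pauses]

# Register the pause tokens with a pretrained model before re-training,
# and grow the embedding matrix to match the enlarged vocabulary.
tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
model = AutoModelForMaskedLM.from_pretrained("bert-base-uncased")
tokenizer.add_tokens([t for _, t in PAUSE_BINS] + [LONG_PAUSE])
model.resize_token_embeddings(len(tokenizer))

words = ["i", "went", "to", "the", "the", "store"]
pauses = [0.02, 0.2, 0.03, 0.02, 1.6, 0.0]
print(encode_transcript(words, pauses))
# -> i went <p_short> to the the <p_long> store
print(encode_transcript(words, perturb_pauses(pauses)))
```

Under this reading, the pause-encoded corpus would then go through standard masked-LM re-training, followed by fine-tuning on the Alzheimer’s and emotion tasks.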

Chairs:
Mathew Magimai Doss
