
Improving Entity Recall In Automatic Speech Recognition With Neural Embeddings

Christopher Li, Pat Rondon, Diamantino Caseiro, Leonid Velikovich, Xavier Velez, Petar Aleksic

Length: 00:13:16
08 Jun 2021

Automatic speech recognition (ASR) systems often have difficulty recognizing long-tail entities such as contact names and local restaurant names, which usually do not occur, or occur only infrequently, in the system's training data. In this work, we present a method that uses learned text embeddings and nearest neighbor retrieval within a large database of entity embeddings to correct misrecognitions. Our text embeddings are produced by a neural network trained so that the embeddings of acoustically confusable phrases have low cosine distances. Given the embedding of the text of a potential entity misrecognition and a precomputed database containing entities and their corresponding embeddings, we use fast, scalable nearest neighbor retrieval algorithms to find candidate corrections within the database. The candidates inserted into the lattice are then scored using a function of the original text's cost in the lattice and the distance between the embedding of the original text and the embedding of the candidate correction. Using this lattice augmentation technique, we demonstrate a 46% reduction in word error rate (WER) and 46% reduction in oracle word error rate (OWER) on evaluation sets with popular film queries.
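To make the retrieval-and-scoring step concrete, the sketch below shows one way such a pipeline could look in Python: a cosine-distance nearest-neighbor lookup over a precomputed table of entity embeddings, followed by a score that combines the original hypothesis's lattice cost with the embedding distance to the candidate correction. The embeddings, entity names, and weights `alpha`/`beta` are illustrative assumptions, not the authors' implementation, which relies on fast approximate nearest-neighbor search rather than the brute-force scan shown here.

```python
import numpy as np


def cosine_distance(a, b):
    """Cosine distance between two embedding vectors."""
    return 1.0 - np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b))


def retrieve_candidates(query_emb, entity_embs, entity_names, k=3):
    """Brute-force nearest-neighbor search over the entity database.
    A production system would use a scalable approximate-NN index instead."""
    dists = [cosine_distance(query_emb, e) for e in entity_embs]
    order = np.argsort(dists)[:k]
    return [(entity_names[i], dists[i]) for i in order]


def score_candidate(original_lattice_cost, embedding_dist, alpha=1.0, beta=5.0):
    """Illustrative scoring function (lower is better): a weighted combination
    of the original text's lattice cost and the embedding distance between the
    original text and the candidate correction."""
    return alpha * original_lattice_cost + beta * embedding_dist


if __name__ == "__main__":
    rng = np.random.default_rng(0)

    # Hypothetical precomputed entity database: names plus their embeddings.
    entity_names = ["parasite", "paddington", "la la land"]
    entity_embs = rng.normal(size=(3, 16))

    # Embedding of a potential misrecognition taken from the ASR lattice,
    # e.g. "pair of sites" confusable with the entity "parasite".
    query_emb = entity_embs[0] + 0.05 * rng.normal(size=16)

    for name, dist in retrieve_candidates(query_emb, entity_embs, entity_names):
        score = score_candidate(original_lattice_cost=2.3, embedding_dist=dist)
        print(f"{name}: distance={dist:.3f}, score={score:.3f}")
```

In practice the retrieved candidates would be spliced into the recognition lattice as alternative arcs, so that the final decoding pass can choose between the original hypothesis and the proposed entity corrections.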

Chairs:
Duc Le
