MHLAT: Multi-hop Label-wise Attention Model for Automatic ICD Coding

Junwen Duan (Central South University); Han Jiang (Central South University); Ying Yu (Central South University)

07 Jun 2023

International Classification of Diseases (ICD) coding is the task of assigning ICD diagnosis codes to clinical notes. It is challenging because of the large label space (nearly 9,000 codes) and lengthy texts (up to 8,000 tokens). Previous studies have explored label-wise attention mechanisms, but they read the text in a single pass, whereas humans tend to re-read the text and the label definitions to reach more confident answers. Moreover, pretrained language models have been applied to this problem but suffer from large memory usage. To address these problems, we propose a simple but effective model, Multi-Hop Label-wise ATtention (MHLAT), in which multi-hop label-wise attention is deployed to obtain more precise and informative representations. Extensive experiments on three benchmark MIMIC datasets show that our method achieves significantly better or competitive performance on all seven metrics, with far fewer parameters to optimize.
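As a rough illustration of the idea described in the abstract, the sketch below shows one plausible way to iterate label-wise attention over multiple hops in PyTorch. The module name, the use of learned label embeddings as queries, the residual query update, and the default of two hops are all assumptions for illustration, not the authors' published implementation.

```python
import torch
import torch.nn as nn


class MultiHopLabelwiseAttention(nn.Module):
    """Hypothetical sketch of multi-hop label-wise attention
    (not the authors' released MHLAT code)."""

    def __init__(self, hidden_dim: int, num_labels: int, num_hops: int = 2):
        super().__init__()
        self.num_hops = num_hops
        # Assumption: each ICD label starts from a learned embedding; the paper
        # may instead build label queries from ICD code descriptions.
        self.label_emb = nn.Parameter(torch.randn(num_labels, hidden_dim))
        self.key_proj = nn.Linear(hidden_dim, hidden_dim)

    def forward(self, tokens: torch.Tensor) -> torch.Tensor:
        # tokens: (batch, seq_len, hidden_dim) token representations of the note
        queries = self.label_emb.unsqueeze(0).expand(tokens.size(0), -1, -1)
        keys = self.key_proj(tokens)
        for _ in range(self.num_hops):
            # Each label attends over all tokens with its own distribution.
            scores = torch.einsum("bld,btd->blt", queries, keys)
            alpha = torch.softmax(scores, dim=-1)
            # Label-specific document representation for this hop.
            context = torch.einsum("blt,btd->bld", alpha, tokens)
            # Refine the label queries with the attended context, then re-attend.
            queries = queries + context
        # (batch, num_labels, hidden_dim), one vector per ICD code
        return queries
```

In such a setup, a per-label classifier (for example, a sigmoid over the dot product with a label-specific weight vector) would turn each output row into a code probability; the repeated attention passes play the role of the "re-reading" the abstract contrasts with single-pass models.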
