11 May 2022

Text classification plays an increasingly important role in natural language processing (NLP). Most research adopts deep neural networks for text classification tasks; however, deep networks often suffer from time-consuming training and hardware dependence. In this paper, a flat network called the broad learning system (BLS) is employed to derive a novel learning method, the node slicing broad learning system (NSBLS). First, a one-to-one correspondence between words and feature node groups is established to obtain a feature layer with rich word information, on the basis of which an enhancement layer representing global information is generated. Then, some nodes in the feature node groups are activated, combined with the enhancement layer, and passed through ridge regression to obtain multiple outputs. Finally, an integration BLS layer corrects and combines these multiple outputs into the final output. Experiments show that NSBLS performs well on several datasets.
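For orientation, the sketch below shows the basic broad learning system pipeline the abstract builds on: a random feature layer, a nonlinear enhancement layer, and closed-form ridge regression for the output weights. It is a minimal, generic single-output BLS, not the node slicing or integration steps specific to NSBLS; the function names, node counts, and the random-projection feature layer are illustrative assumptions, not the paper's implementation.

import numpy as np

def relu(x):
    return np.maximum(x, 0)

def train_bls(X, Y, n_feature_nodes=100, n_enhance_nodes=200, ridge_lambda=1e-3, seed=0):
    """Train a basic broad learning system with ridge-regression output weights.

    X: (n_samples, n_inputs) input vectors (e.g. bag-of-words text features).
    Y: (n_samples, n_classes) one-hot labels.
    """
    rng = np.random.default_rng(seed)

    # Feature layer: random linear mapping of the input (a hypothetical
    # stand-in for the word-to-feature-node-group mapping in the paper).
    Wf = rng.standard_normal((X.shape[1], n_feature_nodes))
    bf = rng.standard_normal(n_feature_nodes)
    Z = X @ Wf + bf                      # feature nodes

    # Enhancement layer: nonlinear expansion of the feature nodes,
    # intended to capture more global information.
    We = rng.standard_normal((n_feature_nodes, n_enhance_nodes))
    be = rng.standard_normal(n_enhance_nodes)
    H = relu(Z @ We + be)                # enhancement nodes

    # Concatenate feature and enhancement nodes into the broad layer.
    A = np.hstack([Z, H])

    # Ridge regression in closed form: W = (A^T A + lambda * I)^(-1) A^T Y,
    # instead of iterative gradient-based training.
    W = np.linalg.solve(A.T @ A + ridge_lambda * np.eye(A.shape[1]), A.T @ Y)
    return (Wf, bf, We, be, W)

def predict_bls(model, X):
    Wf, bf, We, be, W = model
    Z = X @ Wf + bf
    H = relu(Z @ We + be)
    return np.hstack([Z, H]) @ W

# Tiny usage example with random data standing in for text feature vectors.
if __name__ == "__main__":
    X = np.random.rand(32, 50)                  # 32 documents, 50-dim features
    Y = np.eye(3)[np.random.randint(0, 3, 32)]  # 3 classes, one-hot
    model = train_bls(X, Y)
    print(predict_bls(model, X).argmax(axis=1))

Because the output weights come from a single linear solve rather than backpropagation, training is fast and does not require specialized hardware, which is the motivation the abstract gives for preferring a flat BLS over deep structures.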
