
A MEMORY-FREE EVOLVING BIPOLAR NEURAL NETWORK FOR EFFICIENT MULTI-LABEL STREAM LEARNING

Sourav Mishra (Indian Institute of Science, Bangalore); Suresh Sundaram (Indian Institute of Science)

07 Jun 2023

Many fields, such as document tagging, video labeling, and medical analysis, require associating samples with multiple non-exclusive labels, driving research in multi-label learning. Unlike many existing multi-label learning setups, practical applications are challenging because they require learning from a stream of samples and labels. This work proposes an evolving bipolar network architecture (EBN-MSL), consisting of two parallel layers trained in a maximum margin framework, to learn efficiently in a continual multi-label learning scenario without utilizing any samples stored from previous tasks. Two learning setups are considered: one that learns each task separately and another that learns subsequent tasks jointly. Experiments on benchmark multi-label learning datasets establish the superior learning capability of EBN-MSL in the presence of samples having all positive or all negative labels. Results indicate that EBN-MSL significantly outperforms the current state-of-the-art architecture-based continual multi-label learning algorithm.
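The abstract does not give the exact formulation, but the core ingredients it names (two parallel layers, bipolar label encoding, a maximum-margin objective, memory-free streaming updates) can be illustrated with a minimal sketch. The layer sizes, the rule for combining the two branches, and the hinge-style loss below are assumptions for illustration, not the paper's EBN-MSL algorithm.

```python
# Hypothetical sketch: two parallel branches produce bipolar (+1/-1) label
# scores, trained with a per-label hinge (maximum-margin) loss, updated one
# mini-batch at a time without a replay buffer of past samples.
import torch
import torch.nn as nn


class BipolarMultiLabelNet(nn.Module):
    def __init__(self, in_dim: int, hidden_dim: int, num_labels: int):
        super().__init__()
        # Two parallel branches mapping the input to per-label scores
        # (assumed structure; the paper's exact layers may differ).
        self.branch_a = nn.Sequential(
            nn.Linear(in_dim, hidden_dim), nn.Tanh(), nn.Linear(hidden_dim, num_labels)
        )
        self.branch_b = nn.Sequential(
            nn.Linear(in_dim, hidden_dim), nn.Tanh(), nn.Linear(hidden_dim, num_labels)
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # Assumption: combine the branches by a simple difference;
        # sign(score) yields the predicted bipolar label.
        return self.branch_a(x) - self.branch_b(x)


def margin_loss(scores: torch.Tensor, targets: torch.Tensor) -> torch.Tensor:
    # targets are bipolar: +1 for relevant labels, -1 for irrelevant ones.
    # The hinge enforces a unit margin in the correct direction per label.
    return torch.clamp(1.0 - targets * scores, min=0.0).mean()


# Streaming-style update on a synthetic mini-batch (memory-free: no stored
# samples from earlier tasks are revisited).
model = BipolarMultiLabelNet(in_dim=64, hidden_dim=32, num_labels=5)
opt = torch.optim.SGD(model.parameters(), lr=1e-2)
x = torch.randn(8, 64)
y = (torch.rand(8, 5) > 0.5).float() * 2.0 - 1.0  # synthetic bipolar labels
loss = margin_loss(model(x), y)
opt.zero_grad()
loss.backward()
opt.step()
```

A sample with all positive or all negative labels is handled naturally here, since the loss is applied independently per label rather than relying on at least one label of each polarity being present.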
