Sequential Invariant Information Bottleneck
Yichen Zhang (Xi'an Jiaotong University, China); Shujian Yu (Vrije Universiteit Amsterdam); Badong Chen (Xi'an Jiaotong University, China)
Previous approaches to the problem of generalization on out-of-distribution (OOD) data usually assume that data from all environments are available simultaneously, which is unrealistic in real-world applications. In this paper, we develop a new framework, termed the sequential invariant information bottleneck (seq-IIB), to improve the generalization ability of learning agents in sequential environments. Our main idea is to combine the merits of the Information Bottleneck (IB) principle with those of Invariant Risk Minimization (IRM), such that the learning agent gradually removes spurious features and retains \emph{invariant} and \emph{compact} task-relevant information in a sequential manner. Experimental results on three MNIST-like datasets show the effectiveness of our method. Code and supplementary material are available at \url{https://github.com/SJYuCNEL/seq-IIB}.
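The abstract only names the ingredients of the objective. As a rough illustration of how an IB compression term and an IRM invariance penalty might be combined and minimized one environment at a time, the Python/PyTorch sketch below pairs an IRMv1-style dummy-scale penalty with a variational Gaussian bottleneck. The architecture, hyperparameters (beta, lam), and placeholder data are assumptions made purely for illustration and are not the paper's actual implementation; see the linked repository for that.

import torch
import torch.nn as nn
import torch.nn.functional as F

class StochasticEncoder(nn.Module):
    # Gaussian encoder q(z|x); z is the compressed (bottleneck) representation.
    def __init__(self, in_dim=196, z_dim=16):
        super().__init__()
        self.body = nn.Sequential(nn.Linear(in_dim, 128), nn.ReLU())
        self.mu = nn.Linear(128, z_dim)
        self.logvar = nn.Linear(128, z_dim)

    def forward(self, x):
        h = self.body(x)
        mu, logvar = self.mu(h), self.logvar(h)
        z = mu + torch.randn_like(mu) * torch.exp(0.5 * logvar)  # reparameterization trick
        return z, mu, logvar

def compression_term(mu, logvar):
    # Variational proxy for the IB compression term I(X; Z):
    # KL(q(z|x) || N(0, I)), averaged over the batch.
    return 0.5 * (mu.pow(2) + logvar.exp() - logvar - 1).sum(dim=1).mean()

def invariance_penalty(logits, y):
    # IRMv1-style penalty: squared gradient of the risk w.r.t. a dummy classifier scale.
    scale = torch.ones(1, requires_grad=True)
    risk = F.binary_cross_entropy_with_logits(logits * scale, y)
    grad = torch.autograd.grad(risk, [scale], create_graph=True)[0]
    return grad.pow(2).sum()

encoder, classifier = StochasticEncoder(), nn.Linear(16, 1)
opt = torch.optim.Adam(list(encoder.parameters()) + list(classifier.parameters()), lr=1e-3)
beta, lam = 1e-3, 1.0  # IB and IRM trade-off weights (illustrative values)

# Environments arrive one at a time (the sequential setting). The random tensors
# below are placeholders; in practice each environment would supply its own batches
# with environment-specific spurious correlations (e.g., colored-MNIST colors).
for env_id in range(3):
    x = torch.randn(256, 196)
    y = (x[:, 0] > 0).float().unsqueeze(1)
    for step in range(100):
        z, mu, logvar = encoder(x)
        logits = classifier(z)
        risk = F.binary_cross_entropy_with_logits(logits, y)
        loss = risk + lam * invariance_penalty(logits, y) + beta * compression_term(mu, logvar)
        opt.zero_grad()
        loss.backward()
        opt.step()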