Abstract
We introduce an innovative and mathematically rigorous definition for computing common information from multi-view data, drawing inspiration from Gács-Körner common information in information theory. Leveraging this definition, we develop a novel supervised multi-view learning framework to capture both common and unique information. By explicitly minimizing a total correlation term, the framework forces the extracted common information and the unique information from each view to be mutually independent, which in turn theoretically guarantees its effectiveness. To estimate the required information-theoretic quantities, our framework employs the matrix-based Rényi's α-order entropy functional, which forgoes the need for variational approximation and distribution estimation in high-dimensional spaces. We provide a theoretical proof that our framework faithfully discovers both common and unique information from multi-view data. Experiments on synthetic data and seven benchmark real-world datasets demonstrate the superior performance of our proposed framework over state-of-the-art approaches.
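The abstract's estimator of choice, the matrix-based Rényi's α-order entropy functional, can be computed directly from kernel Gram matrices without any density or variational modeling. The sketch below is illustrative only (the Gaussian kernel, the width `sigma`, the order `alpha`, and all helper names are our assumptions, not the paper's code); it shows how marginal entropies, the joint entropy via Hadamard products, and a total correlation term of the kind the framework minimizes could be estimated in NumPy.

```python
import numpy as np

def normalized_gram(X, sigma=1.0):
    """Gaussian-kernel Gram matrix of the samples in X, normalized to unit trace."""
    sq = np.sum(X**2, axis=1, keepdims=True)
    d2 = np.maximum(sq + sq.T - 2.0 * X @ X.T, 0.0)      # pairwise squared distances
    K = np.exp(-d2 / (2.0 * sigma**2))
    K = K / np.sqrt(np.outer(np.diag(K), np.diag(K)))    # unit diagonal
    return K / K.shape[0]                                 # trace = 1

def renyi_entropy(A, alpha=2.0):
    """Matrix-based Rényi α-order entropy of a trace-one positive semi-definite matrix."""
    eigvals = np.linalg.eigvalsh(A)
    eigvals = eigvals[eigvals > 1e-12]
    return np.log2(np.sum(eigvals**alpha)) / (1.0 - alpha)

def joint_entropy(mats, alpha=2.0):
    """Joint entropy from the (trace-normalized) Hadamard product of Gram matrices."""
    H = mats[0]
    for A in mats[1:]:
        H = H * A
    return renyi_entropy(H / np.trace(H), alpha)

def total_correlation(views, alpha=2.0, sigma=1.0):
    """Total correlation: sum of marginal entropies minus the joint entropy."""
    mats = [normalized_gram(X, sigma) for X in views]
    return sum(renyi_entropy(A, alpha) for A in mats) - joint_entropy(mats, alpha)

# Toy usage (hypothetical data): estimate total correlation between two views.
rng = np.random.default_rng(0)
X1, X2 = rng.normal(size=(64, 5)), rng.normal(size=(64, 3))
print(total_correlation([X1, X2], alpha=2.0, sigma=1.0))
```

In the framework described by the abstract, minimizing such a total correlation term over the extracted common and per-view unique representations is what enforces their mutual independence during training.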
Original language | English |
---|---|
Article number | 102400 |
Pages (from-to) | 1-11 |
Number of pages | 11 |
Journal | Information Fusion |
Volume | 108 |
Early online date | 4 Apr 2024 |
DOIs | |
Publication status | Published - Aug 2024 |
Bibliographical note
Publisher Copyright: © 2024 Elsevier B.V.
Funding
This work was supported by the National Natural Science Foundation of China under grant numbers U21A20485, 62311540022, and 62088102.
Funders | Funder number |
---|---|
National Natural Science Foundation of China | U21A20485, 62311540022, 62088102 |
Keywords
- Common information
- Matrix-based Rényi's α-order entropy functional
- Multi-view learning
- Total correlation