TY - UNPB
T1 - Feature-Based Network Construction: From Sampling to What-if Analysis
AU - Franssen, Christian
AU - Berkhout, Joost
AU - Heidergott, Bernd
PY - 2024/12/6
Y1 - 2024/12/6
N2 - Networks are characterized by structural features, such as degree distribution, triangular closures, and assortativity. This paper addresses the problem of reconstructing instances of continuously (and non-negatively) weighted networks from given feature values. We introduce the gradient-based Feature-Based Network Construction (FBNC) framework. FBNC allows for sampling networks that satisfy prespecified features exactly (hard-constraint sampling). When initialized with a random graph, the FBNC gradient descent can be used as an alternative to exponential random graph models for sampling graphs conditional on given feature values. We establish an implicit regularization approach to the original feature-fitting loss minimization problem so that FBNC achieves a parsimonious change in the underlying graph, where the term “implicit” stems from using appropriate norms in the very construction of the FBNC gradient descent. In constructing the implicit regularization, we distinguish between the case where link weights can be chosen from a bounded range and the more demanding case where the weight matrix of the graph constitutes a Markov chain. We show that FBNC extends to “what-if analysis” of networks: for a given initial network and a set of features satisfied by this network, FBNC finds the network closest to the initial network with some of the feature values adjusted or new features added. Numerical experiments in social network management and financial network regulation demonstrate the value of FBNC for graph (re)construction and what-if analysis.
KW - network construction
KW - steepest feasible descent
KW - network what-if analysis
KW - implicit regularization
U2 - 10.48550/arXiv.2412.05124
DO - 10.48550/arXiv.2412.05124
M3 - Preprint
BT - Feature-Based Network Construction: From Sampling to What-if Analysis
PB - arXiv.org
ER -