The Informational Coherence Index: A Metric for Evaluating Integration in Networks of Artificial Intelligence Models
We introduce the Informational Coherence Index (Icoer), a bounded metric for quantifying the degree of integration and alignment among interconnected artificial intelligence (AI) models. The index combines four interpretable components: normalized processing capacity, a Gaussian informational coupling function that decays with inter-model distance, an entropy-based weight reflecting output uncertainty, and a Lorentzian resonance factor capturing synchrony among models. We prove that Icoer ∈ [0, 1] and is monotonically decreasing in both informational distance and entropy. We derive closed-form gradients for optimizing network coherence via gradient ascent and demonstrate convergence on networks of up to 100 models. We also propose a pairwise extension of the index for settings where inter-model distances are defined over embedding spaces. Simulation experiments confirm that the metric responds proportionally to parameter variations, correctly identifies informational bottlenecks, and scales to large networks. We discuss the relationship between Icoer and established measures such as agreement rate and mutual information, and outline directions for empirical validation on real AI systems.
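The abstract does not state the closed form of Icoer, but the four components described above suggest a product of bounded factors. The following Python sketch is a hypothetical illustration under that assumption only; the function name coherence_index and all parameters (capacity_max, sigma, entropy_max, delta_phase, gamma) are placeholders, not the paper's notation.

```python
import numpy as np

def coherence_index(capacity, capacity_max, distance, sigma,
                    entropy, entropy_max, delta_phase, gamma):
    """Illustrative coherence score built from the four components named in the abstract.

    The multiplicative structure and all functional forms are assumptions for
    illustration; the paper's exact definition of Icoer may differ.
    """
    # Normalized processing capacity, clipped to [0, 1].
    c_norm = np.clip(capacity / capacity_max, 0.0, 1.0)
    # Gaussian informational coupling: decays with inter-model distance.
    coupling = np.exp(-distance**2 / (2.0 * sigma**2))
    # Entropy-based weight: higher output uncertainty lowers the score.
    entropy_weight = 1.0 - np.clip(entropy / entropy_max, 0.0, 1.0)
    # Lorentzian resonance factor: peaks when the models are in phase (delta_phase ≈ 0).
    resonance = 1.0 / (1.0 + (delta_phase / gamma)**2)
    # A product of four factors in [0, 1] remains in [0, 1] and is monotonically
    # decreasing in distance and entropy, consistent with the stated properties.
    return c_norm * coupling * entropy_weight * resonance

# Example: a close, low-entropy, in-phase pair scores near 1; a distant, noisy pair near 0.
print(coherence_index(capacity=8.0, capacity_max=10.0, distance=0.5, sigma=1.0,
                      entropy=0.2, entropy_max=1.0, delta_phase=0.1, gamma=0.5))
print(coherence_index(capacity=8.0, capacity_max=10.0, distance=3.0, sigma=1.0,
                      entropy=0.9, entropy_max=1.0, delta_phase=1.5, gamma=0.5))
```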