Understanding the statistical laws governing citation dynamics remains a fundamental challenge in network theory and the science of science. Citation networks typically exhibit in-degree distributions well approximated by log-normal distributions yet also display power-law behaviour in the high-citation regime -- an apparent contradiction lacking a unified explanation. Here we identify a previously unrecognised phenomenon: the variance of the logarithm of citation counts per unit time follows a power law with respect to time ($t$) since publication, scaling as $t^{H}$, with $H$ constant. This discovery introduces a new challenge while simultaneously offering a crucial clue to resolving this discrepancy. We develop a stochastic model in which latent attention to publications evolves through a memory-driven process with cumulative advantage, modelled as fractional Brownian motion with Hurst parameter $H$ and a volatility parameter. We show that antipersistent fluctuations in attention ($H < 1/2$) yield log-normal citation distributions, whereas persistent attention dynamics ($H > 1/2$) favour heavy-tailed power laws, thus resolving the log-normal--power-law contradiction. Numerical simulations confirm both the $t^{H}$ law and the transition between regimes. Empirical analysis of arXiv e-prints indicates that the latent attention process is intrinsically antipersistent ($H \approx 0.13$). By linking memory effects and stochastic fluctuations in attention to broader network dynamics, our findings provide a unifying framework for understanding the evolution of collective attention in science and other attention-driven processes.
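As a minimal illustration of the kind of process invoked above, the sketch below simulates standard fractional Brownian motion (fBm) with an antipersistent Hurst parameter ($H = 0.13$, the value reported for arXiv e-prints) and recovers the well-known variance scaling of fBm itself, $\mathrm{Var}[B_H(t)] = t^{2H}$, by log-log regression. This is generic fBm machinery, not a reconstruction of the paper's full model (the cumulative-advantage and citation-generation components are omitted); all function names and parameters here are illustrative choices.

```python
import numpy as np

def simulate_fbm(n_steps, hurst, n_paths=4000, seed=0):
    """Simulate fBm paths on an integer time grid t = 1..n_steps.

    Uses Cholesky factorisation of the exact fractional Gaussian noise
    (fGn) autocovariance  g(k) = 0.5 * ((k+1)^{2H} - 2 k^{2H} + |k-1|^{2H}),
    then cumulative summation of the correlated increments.
    """
    rng = np.random.default_rng(seed)
    k = np.arange(n_steps)
    g = 0.5 * ((k + 1.0) ** (2 * hurst)
               - 2.0 * k ** (2 * hurst)
               + np.abs(k - 1.0) ** (2 * hurst))
    # Toeplitz covariance matrix of the stationary increments.
    lag = np.abs(np.subtract.outer(k, k))
    cov = g[lag]
    L = np.linalg.cholesky(cov + 1e-10 * np.eye(n_steps))  # jitter for stability
    increments = L @ rng.standard_normal((n_steps, n_paths))
    B = np.cumsum(increments, axis=0)           # fBm paths, shape (n_steps, n_paths)
    t = np.arange(1, n_steps + 1, dtype=float)
    return t, B

H = 0.13  # antipersistent regime, as reported empirically for arXiv e-prints
t, B = simulate_fbm(256, H)

# Empirical variance across paths at each time, then a log-log slope fit:
var = B.var(axis=0 - 0, ddof=0) if False else B.var(axis=1)
slope = np.polyfit(np.log(t), np.log(var), 1)[0]
# slope should be close to 2H = 0.26 for this standard fBm construction
```

For $H < 1/2$ the increments are negatively correlated (antipersistent), so the variance grows more slowly than for ordinary Brownian motion ($H = 1/2$); setting `hurst` above $1/2$ flips the sign of the correlations and steepens the growth, mirroring the two regimes contrasted in the abstract.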