🔬 Information Theory

74 Cross-Domain Bridges · 1 Open Unknown · 10 Active Hypotheses

Cross-Domain Bridges

Bridge The "grokking" generalisation transition in deep learning is a second-order phase transition governed by the same universality classes that describe magnetisation, percolation, and neural avalanches in physical systems.

Fields: Machine Learning, Statistical Physics, Information Theory, Neuroscience

Grokking — the phenomenon where a neural network suddenly transitions from memorisation to generalisation after a long plateau — exhibits sharp, non-analytic changes in the effective dimensionality of...

Bridge Aesthetic preference correlates with intermediate algorithmic complexity: Birkhoff's measure M = O/C, Kolmogorov complexity, and fractal dimension operationalise the information-theoretic "sweet spot" between randomness and repetition, unifying aesthetics with mathematics and cognitive science.

Fields: Aesthetics, Cognitive Science, Information Theory, Mathematics, Music Cognition, Visual Neuroscience

Birkhoff (1933) defined aesthetic measure as M = O/C — order divided by complexity. High order with low complexity (a single constant tone, a uniform colour field) has M → ∞ but is perceived as boring...
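
A computable stand-in for this claim: Kolmogorov complexity is uncomputable, but a lossless compression ratio gives a rough proxy. A minimal sketch, where the three signals are illustrative assumptions rather than stimuli from any aesthetics study:

```python
import random
import zlib

def compression_ratio(s: bytes) -> float:
    """Compressed length / original length: a crude, computable proxy
    for Kolmogorov complexity (which is uncomputable)."""
    return len(zlib.compress(s, 9)) / len(s)

random.seed(0)
repetitive = b"ABAB" * 2500                                     # high order, trivially compressible
structured = bytes(int(127 + 100 * __import__("math").sin(i / 10)) for i in range(10_000))  # quasi-periodic
noise = bytes(random.randrange(256) for _ in range(10_000))     # incompressible

for name, s in [("repetitive", repetitive), ("structured", structured), ("random", noise)]:
    print(f"{name:11s} compression ratio = {compression_ratio(s):.3f}")
# The claimed aesthetic "sweet spot" corresponds to intermediate ratios:
# neither ~0 (pure repetition) nor ~1 (incompressible noise).
```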

Bridge The black hole information paradox is an information-theoretic crisis: whether quantum gravity destroys von Neumann entropy is equivalent to whether the black hole acts as a quantum channel with zero capacity, and the holographic principle (AdS/CFT) resolves this by identifying bulk gravity with a boundary quantum error-correcting code.

Fields: Astronomy, Quantum Gravity, Information Theory, Quantum Error Correction

Hawking's 1974 calculation showed that black holes radiate thermally, apparently destroying the quantum information contained in infalling matter. This is the information paradox: unitary quantum mech...

Bridge The Bekenstein-Hawking entropy S = A/4 (area, not volume) of a black hole implies the holographic principle — that the maximum information content of any 3D region is bounded by its 2D boundary area, making information theory and spacetime geometry equivalent at the Planck scale.

Fields: Astrophysics, Information Theory, Quantum Gravity, Theoretical Physics

The discovery that black holes have entropy proportional to their surface area — not volume — is the most profound known connection between spacetime geometry and information theory. 1. Bekenstein-Haw...

Bridge CRISPR Base Editing x Error Correction - adenine base editor as bit-flip corrector

Fields: Biology, Computer Science, Information Theory

Adenine base editors (ABEs) convert A-T to G-C base pairs without double-strand breaks, implementing a precise one-bit correction in the genomic information channel; the specificity window (protospace...

Bridge Gene Expression Noise x Information Theory - transcriptional channel capacity

Fields: Biology, Computer Science, Information Theory

Gene regulatory networks face a fundamental channel capacity limit: the maximum mutual information between transcription factor concentration (input) and target gene expression (output) is bounded by ...
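
A minimal sketch of how such a channel capacity is computed in practice: the Blahut-Arimoto algorithm on a small discrete input-output matrix. The 3x3 transition matrix below is a made-up stand-in for a TF-concentration to expression-level channel, not measured data:

```python
import numpy as np

def blahut_arimoto(W, iters: int = 1000) -> float:
    """Capacity (bits per use) of a discrete memoryless channel with
    row-stochastic transition matrix W[x, y] = P(y|x), via Blahut-Arimoto."""
    W = np.asarray(W, dtype=float)
    p = np.full(W.shape[0], 1.0 / W.shape[0])        # input distribution, start uniform
    for _ in range(iters):
        q = p @ W                                    # output marginal P(y)
        with np.errstate(divide="ignore", invalid="ignore"):
            logterm = np.where(W > 0, np.log2(W / q), 0.0)
        d = (W * logterm).sum(axis=1)                # D( W[x,:] || q ) in bits
        p = p * np.exp2(d)                           # standard BA reweighting step
        p /= p.sum()
    q = p @ W
    with np.errstate(divide="ignore", invalid="ignore"):
        logterm = np.where(W > 0, np.log2(W / q), 0.0)
    return float((p * (W * logterm).sum(axis=1)).sum())   # I(X;Y) at the optimised input

# Toy "expression channel": rows are transcription-factor input levels,
# columns are observed expression levels (numbers are illustrative, not measured).
W = [[0.80, 0.15, 0.05],
     [0.20, 0.60, 0.20],
     [0.05, 0.15, 0.80]]
print(f"channel capacity ≈ {blahut_arimoto(W):.3f} bits per readout")
```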

Bridge Information Theory x Evolutionary Biology — natural selection as Bayesian inference

Fields: Biology, Computer Science, Information Theory, Evolutionary Biology

Natural selection updates the population's genetic prior toward higher fitness using the same mathematical operation as Bayesian belief updating; Fisher's fundamental theorem of natural selection is t...

Bridge Neural spike coding x Information compression — retinal ganglion cells as efficient encoders

Fields: Neuroscience, Computer Science, Information Theory

Retinal ganglion cell spike trains are efficient codes in the information-theoretic sense; center-surround receptive fields implement a whitening filter that removes spatial redundancy in natural imag...

Bridge The genetic code is a near-optimal digital error-correcting code: codon degeneracy implements a natural parity-check scheme that minimises the chemical impact of single-base mutations, and the 64-codon/20-amino-acid mapping operates near the Shannon capacity of the DNA replication channel.

Fields: Molecular Biology, Information Theory, Coding Theory, Evolutionary Biology, Genetics

Shannon's channel coding theorem (1948) establishes that for any noisy channel with capacity C = B log₂(1 + SNR), there exist codes that transmit information with arbitrarily small error probability a...

Bridge Codon usage bias encodes translational kinetics as an information channel: synonymous codons are not equivalent in translation speed, and organisms optimise codon usage to maximise ribosome throughput — a rate-distortion problem where the coding redundancy of the genetic code is exploited to tune the channel capacity of the translation machinery.

Fields: Molecular Biology, Information Theory, Computational Biology

The genetic code has 64 codons encoding 20 amino acids plus stop signals, giving ~1.5 bits of coding redundancy per codon. Synonymous codons (different codons for the same amino acid) are used non-uni...
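
A quick worked check of the redundancy figure, assuming a uniform codon alphabet and treating stop as one extra symbol; the exact value depends on how stops and amino-acid usage frequencies are counted:

```python
from math import log2

codon_bits = log2(64)         # 6 bits per codon (4 bases ^ 3 positions)
symbol_bits = log2(20 + 1)    # 20 amino acids plus a single stop "symbol" ≈ 4.39 bits
print(f"redundancy ≈ {codon_bits - symbol_bits:.2f} bits/codon")   # ≈ 1.6 bits
# Non-uniform amino-acid usage lowers the effective entropy of the protein
# alphabet below log2(21), so the usable redundancy is somewhat larger still.
```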

Bridge Collective animal behaviors — fish schooling, bird murmurations, insect swarms — use information cascade and quorum sensing mechanisms that bridge biology and information theory: individuals integrate local signals to make collective decisions whose speed, accuracy, and robustness are governed by the same signal detection and information aggregation principles as engineered sensor networks.

Fields: Biology, Information Theory, Collective Behavior

Quorum sensing in bacteria: the threshold concentration S_q where gene expression switches satisfies ∂F/∂S = 0 (Hill function bistability), giving a sharp collective switch at population density N > N...

Bridge Multiplexed CRISPR perturbation screens pool many distinct guide RNAs or targets into bulk assays and infer genetic effects by decoding barcode identities — abstractly reminiscent of designing redundant identifiers so pooled measurements tolerate dropout or misreads — **not** claiming biological machinery implements Reed–Solomon codes; only an information-design analogy for experimental planning.

Fields: Biology, Information Theory, Genomics

High-throughput pooled CRISPR experiments assign binary-like signatures to perturbations so downstream sequencing demultiplexes signals — coding theory supplies intuition about Hamming distance and re...

Bridge Kauffman's NK model maps gene regulatory networks onto Boolean circuits — cell types are attractors and the critical K=2 regime corresponds to edge-of-chaos dynamics

Fields: Biology, Information Theory, Computer Science

Kauffman (1969) modeled gene regulatory networks as Boolean networks: N genes each updated by a Boolean function of K randomly chosen inputs. For K < 2, networks freeze in ordered attractors; for K > ...

Bridge The sequence specificity of protein-DNA binding is quantified by information theory: the sequence logo information content (bits) equals the reduction in positional entropy, and the total information in a binding site predicts the number of sites in a genome.

Fields: Molecular Biology, Information Theory

Schneider & Stephens (1990) showed that transcription factor binding sites can be quantified as information in bits: the information content Ri = 2 − H(position), where H is Shannon entropy over the f...
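
A minimal sketch of the R_i = 2 − H_i calculation on a fabricated position frequency matrix; the small-sample correction in Schneider & Stephens is omitted:

```python
import numpy as np

def information_content(pfm: np.ndarray) -> np.ndarray:
    """R_i = 2 - H_i (bits) per position for a DNA position frequency matrix
    of shape (positions, 4), each row summing to 1."""
    H = -np.sum(np.where(pfm > 0, pfm * np.log2(pfm), 0.0), axis=1)
    return 2.0 - H

# Illustrative 4-position motif (columns: A, C, G, T)
pfm = np.array([
    [0.97, 0.01, 0.01, 0.01],   # nearly invariant position -> ~2 bits
    [0.40, 0.10, 0.40, 0.10],   # partially constrained
    [0.25, 0.25, 0.25, 0.25],   # unconstrained -> 0 bits
    [0.05, 0.05, 0.85, 0.05],
])
print(information_content(pfm).round(2))   # the sequence logo is these values stacked per position
```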

Bridge Stochastic resonance in nonlinear biochemical sensors links noise-assisted threshold crossing to information-detection gains in weak biological signaling.

Fields: Biophysics, Information Theory, Systems Biology, Nonlinear Dynamics

In excitable and threshold-like cellular pathways, moderate noise can increase detectability of weak periodic inputs by synchronizing barrier crossings with subthreshold stimuli. This maps directly to...

Bridge The efficient coding hypothesis (Barlow 1961) unifies sensory neuroscience and information theory: retinal whitening, V1 Gabor receptive fields, and auditory log-frequency tuning all follow from maximizing Shannon information transmission per unit metabolic cost.

Fields: Neuroscience, Cognitive Science, Information Theory, Sensory Physiology, Computational Neuroscience

Barlow (1961) proposed that the goal of sensory processing is to represent the environment using the minimum number of active neurons — equivalently, to maximize the Shannon mutual information I(stimu...

Bridge Computational complexity and phase transitions — NP-hard problem hardness exhibits thermodynamic-like phase transitions governed by the same statistical physics of disordered systems

Fields: Computer Science, Mathematics, Statistical Physics, Combinatorics, Information Theory

Many NP-complete problems (3-SAT, graph coloring, random k-SAT, traveling salesman) exhibit sharp phase transitions in their typical-case hardness as a control parameter varies. In random k-SAT: let α...

Bridge Contrastive self-supervised learning — pulling positive pairs together and pushing negatives apart — resembles learning energy-based and Boltzmann-machine style scores where temperature controls sharpness of discrimination.

Fields: Machine Learning, Statistical Physics, Computer Science, Information Theory

Energy-based models assign low energy to plausible configurations; training shapes the energy landscape so that data lie in wells. Contrastive objectives such as InfoNCE reweight logits of positive ve...

Bridge PAC learning theory ↔ statistical generalisation — VC dimension as the degrees of freedom of a hypothesis class

Fields: Computer Science, Theoretical Machine Learning, Statistics, Statistical Physics, Information Theory

PAC (Probably Approximately Correct) learning theory (Valiant 1984) provides a mathematical framework for when a learning algorithm can generalise from training data to unseen examples. A concept clas...

Bridge DNA replication x Error-correcting codes - polymerase proofreading as channel coding

Fields: Biology, Computer Science, Information Theory, Molecular Biology

DNA replication achieves an error rate of approximately 10^-9 per base through a three-stage error-correction pipeline (polymerase insertion selectivity 10^-5, 3'→5' exonuclease proofreading 10^-2, p...

Bridge Shannon entropy applied to species relative abundances gives the Shannon diversity index; Hill numbers unify Shannon (q→1), Simpson (q=2), and species richness (q=0) as the Rényi entropy family applied to ecology; and MaxEnt models derive species abundance distributions from the same thermodynamic analogy that produces the Boltzmann distribution.

Fields: Ecology, Biodiversity Science, Information Theory, Statistical Mechanics, Biogeography

Shannon's entropy H = -Σ_i p_i log p_i applied to species i with relative abundance p_i is used directly as a biodiversity index (H' or Shannon diversity), quantifying uncertainty in the species ident...
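
A short sketch of the Hill-number family D_q on an assumed abundance vector; D_0 is species richness, D_1 the exponential of Shannon entropy, and D_2 the inverse Simpson index:

```python
import numpy as np

def hill_number(abundances, q: float) -> float:
    """Hill number D_q = (sum_i p_i^q)^(1/(1-q)); the q -> 1 limit is exp(Shannon H)."""
    p = np.asarray(abundances, dtype=float)
    p = p[p > 0] / p.sum()
    if np.isclose(q, 1.0):
        return float(np.exp(-np.sum(p * np.log(p))))   # exp of Shannon entropy (nats)
    return float(np.sum(p ** q) ** (1.0 / (1.0 - q)))

counts = [50, 30, 10, 5, 3, 1, 1]          # illustrative species counts
for q in (0, 1, 2):
    print(f"D_{q} = {hill_number(counts, q):.2f}")
# D_0 = 7 (richness); D_1 and D_2 shrink as q grows because rare species count less.
```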

Bridge MaxEnt species distribution modelling is the ecological application of Jaynes' maximum entropy principle: given presence-only occurrence data and environmental features, MaxEnt finds the distribution of maximum entropy subject to empirical feature constraints — a result formally identical to a Gibbs distribution and to maximum likelihood estimation in a Poisson point process model.

Fields: Ecology, Statistics, Information Theory, Conservation Biology, Bayesian Inference

Jaynes (1957) formulated the maximum entropy (MaxEnt) principle for statistical inference: among all probability distributions consistent with known constraints (expected values of observable features...

Bridge The Efficient Market Hypothesis (Fama 1970) — that asset prices reflect all available information — is the statement that price processes are martingales (E[P_{t+1}|F_t] = P_t); market anomalies are quantifiable as residual mutual information between price history and future returns.

Fields: Economics, Information Theory, Probability Theory, Finance, Stochastic Processes

Fama (1970) defined the Efficient Market Hypothesis (EMH): asset prices fully reflect all available information. Samuelson (1965) showed that this is mathematically equivalent to the statement that pr...

Bridge The Boltzmann-Gibbs exponential wealth distribution arising from entropy maximization subject to wealth conservation is the economic analog of the Maxwell-Boltzmann energy distribution in statistical mechanics: mean wealth is the economic "temperature," wealth exchanges are binary collisions, and the Lorenz curve is the cumulative distribution function of kinetic energy.

Fields: Economics, Statistical Physics, Econophysics, Information Theory

Dragulescu & Yakovenko (2000) demonstrated that if economic agents exchange wealth in random pairwise interactions conserving total wealth (analogous to elastic collisions conserving energy), the stat...

Bridge Maxwell's equations in free space predict plane wave solutions with the same mathematical form as carrier waves in communications — the electromagnetic spectrum is a physical implementation of Shannon's abstract channel model.

Fields: Electromagnetism, Information Theory, Communications Engineering

Maxwell's equations in free space admit plane wave solutions of the form E = E₀ exp(i(k·r − ωt)), which are identical in mathematical structure to the carrier waves used in all radio, microwave, and o...

Bridge Shannon's source coding theorem establishes that the entropy H of a source is the fundamental limit of lossless compression, while rate-distortion theory provides the optimal lossy compression bound R(D) — limits that Huffman coding, arithmetic coding, and Lempel-Ziv algorithms approach through distinct mathematical strategies, and that JPEG/MP3 operate near in practice.

Fields: Engineering, Mathematics, Information Theory, Computer Science

Shannon's source coding theorem (1948) proves that a source with entropy H bits/symbol can be losslessly compressed to H bits/symbol on average but not below — setting an absolute mathematical lower ...
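
A minimal sketch comparing source entropy with the expected length of a Huffman code for an assumed symbol distribution, illustrating the H ≤ L < H + 1 guarantee for symbol-by-symbol coding:

```python
import heapq
from math import log2

def huffman_lengths(probs: dict) -> dict:
    """Code lengths (bits) of a binary Huffman code for the given symbol probabilities."""
    heap = [(p, i, {s: 0}) for i, (s, p) in enumerate(probs.items())]
    heapq.heapify(heap)
    counter = len(heap)
    while len(heap) > 1:
        p1, _, c1 = heapq.heappop(heap)
        p2, _, c2 = heapq.heappop(heap)
        merged = {s: depth + 1 for s, depth in {**c1, **c2}.items()}  # every leaf below gains one bit
        heapq.heappush(heap, (p1 + p2, counter, merged))
        counter += 1
    return heap[0][2]

probs = {"a": 0.45, "b": 0.25, "c": 0.15, "d": 0.10, "e": 0.05}   # assumed source
lengths = huffman_lengths(probs)
H = -sum(p * log2(p) for p in probs.values())
L = sum(probs[s] * lengths[s] for s in probs)
print(f"entropy H = {H:.3f} bits/symbol, Huffman expected length L = {L:.3f}")
```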

Bridge The adaptive immune system solves a high-dimensional pattern detection problem using stochastic V(D)J recombination to generate a diverse receptor repertoire, thymic selection to set affinity thresholds, and clonal expansion as a Bayesian posterior update — mathematically equivalent to a noisy channel decoder for self/non-self discrimination.

Fields: Immunology, Physics, Information Theory, Statistical Mechanics, Mathematics

The adaptive immune system must recognize ~10¹⁵ possible foreign antigens using only ~10⁷ circulating T-cell clones (each with a distinct T-cell receptor, TCR). This is a covering problem: the T-cell ...

Bridge Eigen's quasispecies error threshold in molecular evolution and Shannon's channel capacity theorem in information theory are the same mathematical result — the mutation rate at which genetic information is irreversibly lost is the Shannon capacity of the replication channel.

Fields: Information Theory, Molecular Evolution, Statistical Physics, Virology

Manfred Eigen's quasispecies theory (1971) shows that a replicating population of sequences (RNA, DNA, or proteins) undergoes a phase transition at a critical mutation rate mu_c: below mu_c, a "master...

Bridge Scientific knowledge overload is a channel-capacity problem: the rate of cross-domain insight generation is limited not by the volume of published results but by the bandwidth of the translation layer between domain vocabularies — structured cross-domain bridges function as a lossless codec reducing mutual information distance without destroying signal.

Fields: Information Theory, Epistemology, Network Science, Cognitive Science, Library Science, Science Of Science

Shannon's channel capacity theorem (C = B log₂(1 + S/N)) provides a formal framework for the scientific knowledge overload problem. Consider each scientific domain as a transmitter and each researcher...

Bridge Belief propagation on factor graphs bridges probabilistic inference in computer science with haplotype phasing and genotype imputation pipelines in statistical genetics.

Fields: Information Theory, Genetics, Computer Science

Established engineering practice uses sum-product / approximate message passing algorithms on graphical models for large-scale genotype phasing and related inference tasks; residual speculative analog...

Bridge DNA is a digital information storage medium whose structure, redundancy, and mutation dynamics are quantitatively captured by Shannon's information theory — the genetic code is a natural error-correcting code whose properties minimize the cost of single-nucleotide substitutions.

Fields: Information Theory, Molecular Biology, Genetics, Evolutionary Biology

Shannon's (1948) framework maps onto molecular genetics with striking precision. The DNA alphabet has size q = 4 (A, T, G, C), so the maximum entropy per position is log₂(4) = 2 bits. The information ...

Bridge Stochastic process entropy rate h limits optimal prediction bits per symbol for stationary ergodic sources — connecting to cross-entropy training objectives for language models whose perplexity exp(H) measures geometric mean uncertainty per token under the model distribution versus empirical text statistics.

Fields: Information Theory, Computational Linguistics, Machine Learning

Shannon–McMillan–Breiman asymptotic equipartition implies typical sequences carry ~nh bits per n symbols for ergodic processes with entropy rate h. Neural language models minimize average negative log...
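
A toy illustration of the perplexity = 2^(cross-entropy) relationship; both the "corpus" and the unigram model below are fabricated:

```python
from math import log2

corpus = "the cat sat on the mat the cat ran".split()        # toy held-out text
model = {"the": 0.30, "cat": 0.20, "sat": 0.10, "on": 0.10,
         "mat": 0.10, "ran": 0.10, "<unk>": 0.10}             # assumed unigram model

# average negative log-likelihood in bits/token = cross-entropy of the data under the model
nll = [-log2(model.get(w, model["<unk>"])) for w in corpus]
cross_entropy = sum(nll) / len(nll)
perplexity = 2 ** cross_entropy
print(f"cross-entropy = {cross_entropy:.3f} bits/token, perplexity = {perplexity:.2f}")
# For a stationary ergodic source, cross-entropy is bounded below by the entropy rate h,
# so perplexity can approach but not beat 2^h.
```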

Bridge Zipf's law (word frequency proportional to 1/rank) is derivable from the principle of least effort — a communication system minimising joint speaker-listener effort converges on a power-law frequency distribution identical to Shannon's optimal coding theorem applied to natural language.

Fields: Linguistics, Information Theory, Cognitive Science, Statistical Physics, Complexity Science

Zipf (1949) observed that the frequency of a word is inversely proportional to its rank in the frequency table: f(r) ∝ 1/r. This power law appears in word frequencies across all natural languages, cit...

Bridge Chaos x Ergodic theory - sensitivity as mixing

Fields: Mathematics, Physics, Dynamical_Systems, Information_Theory

Deterministic chaos (positive Lyapunov exponents, sensitive dependence on initial conditions) is the physical manifestation of ergodic mixing in measure-preserving dynamical systems; the Kolmogorov-Si...

Bridge The Fisher information matrix on the space of allele frequency distributions defines the Shahshahani Riemannian metric on population-genetic state space, making Amari's natural gradient descent in statistical learning the exact formal counterpart of Fisher's fundamental theorem — the rate of mean fitness increase equals the Fisher information about the selective environment.

Fields: Mathematics, Evolutionary Biology, Information Theory, Statistics

The space of probability distributions over a discrete variable forms a Riemannian manifold equipped with the Fisher information metric g_{ij} = E[∂_i log p · ∂_j log p], where i,j index parameters of...

Bridge Zipf's law (word frequency f_r ∝ r^{-α}, α ≈ 1) emerges from entropy maximisation in communication systems — it is the signature of a channel operating at maximum communicative efficiency minimising joint speaker-listener effort, and the same power law appears in city sizes, income distributions, citation counts, and any rank-frequency distribution generated by an entropy-maximising process under a frequency constraint.

Fields: Linguistics, Information Theory, Mathematics, Statistical Physics, Cognitive Science

Zipf (1935, 1949) documented that in any natural language corpus the r-th most frequent word has frequency f_r ≈ C / r (Zipf's law, exponent α = 1 exactly). He proposed a "principle of least effort": ...
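
A minimal sketch of estimating the Zipf exponent by a log-log least-squares fit on word counts; the sample text is a placeholder, and a serious estimate needs a large corpus and a maximum-likelihood power-law fit:

```python
import numpy as np
from collections import Counter

text = ("the quick brown fox jumps over the lazy dog the fox the dog "
        "a the of and to in the of a").split()                # toy corpus stand-in
counts = np.array(sorted(Counter(text).values(), reverse=True), dtype=float)
ranks = np.arange(1, len(counts) + 1, dtype=float)

# Fit log f = log C - alpha * log r by ordinary least squares.
slope, intercept = np.polyfit(np.log(ranks), np.log(counts), 1)
print(f"estimated Zipf exponent alpha ≈ {-slope:.2f}")         # ~1 for large natural-language corpora
```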

Bridge Friston's free energy principle — the brain as a hierarchical generative model minimising variational free energy F = KL[q(θ)||p(θ|data)] ≥ −log p(data) — unifies Bayesian inference, predictive coding, perception, action, and attention as gradient descent on surprise, with clinical implications for hallucination and schizophrenia as precision-weighting failures.

Fields: Mathematics, Neuroscience, Cognitive Science, Statistics, Information Theory

The predictive coding framework (Rao & Ballard 1999) proposes that cortical processing is bidirectional: top-down connections carry predictions x̂_L = f(x_{L+1}) from higher to lower levels, while bot...

Bridge Brain-computer interfaces decode motor intentions from cortical population activity using linear decoders (Wiener filter) and Kalman state-space models — Fisher information in the neural population code sets the fundamental accuracy bound, connecting information theory to neural prosthetics engineering.

Fields: Neuroscience, Engineering, Neural Engineering, Information Theory, Signal Processing

BCIs decode intended movement from neural population activity recorded by electrode arrays. Linear decoding: ŷ = Wx + b where x ∈ R^N is the spike rate vector from N neurons, y is decoded kinematics (...
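
A minimal ridge-regression (Wiener-style) decoder on synthetic spike rates; the dimensions, noise level, and linear tuning model are illustrative assumptions, not recorded data:

```python
import numpy as np

rng = np.random.default_rng(0)
T, N = 2000, 50                                   # time bins, neurons
true_W = rng.normal(size=(2, N))                  # hypothetical linear tuning map
velocity = rng.normal(size=(T, 2))                # latent 2-D hand velocity
rates = velocity @ true_W + 0.5 * rng.normal(size=(T, N))    # noisy spike-rate vectors

# Ridge (regularised Wiener) decoder: W_dec = (X^T X + lam*I)^-1 X^T V
lam = 1.0
W_dec = np.linalg.solve(rates.T @ rates + lam * np.eye(N), rates.T @ velocity)
v_hat = rates @ W_dec
r2 = 1 - np.sum((velocity - v_hat) ** 2) / np.sum((velocity - velocity.mean(0)) ** 2)
print(f"decoding R^2 ≈ {r2:.2f}")
# A Kalman decoder additionally models velocity dynamics v_t = A v_{t-1} + w_t,
# which matters when kinematics are temporally smooth (unlike this white-noise toy).
```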

Bridge Sensory neurons as Shannon information channels — efficient coding and neural channel capacity

Fields: Neuroscience, Information Theory, Sensory Physiology, Computational Neuroscience

The nervous system encodes stimuli as spike trains — discrete all-or-none action potentials — which can be analysed as Shannon communication channels. The channel capacity C = B log₂(1 + S/N) bounds t...

Bridge Intrinsic motivation and autonomy as defined in self-determination theory are operationalisable as information-theoretic quantities — specifically, empowerment (the maximum mutual information between an agent's actions and their future states) and free-energy minimization — providing a neurocomputational mechanism for why autonomy need satisfaction predicts psychological well-being.

Fields: Neuroscience, Information Theory, Cognitive Science, Psychology

Ryan and Deci (2000, 27 k citations) established that intrinsic motivation, competence, and autonomy are fundamental psychological needs whose satisfaction predicts well-being. Information theory and ...

Bridge Integrated Information Theory (IIT) proposes that consciousness corresponds to integrated information Φ — a measure of how much a system generates information above and beyond its parts — connecting neuroscience to information theory, statistical mechanics, and the mathematics of causal structure.

Fields: Neuroscience, Mathematics, Information Theory

IIT (Tononi 2004, 2014) defines Φ as the minimum information generated by a system as a whole beyond its minimum information partition (MIP). Mathematically, Φ is a measure over a causal structure (di...

Bridge Friston's Free Energy Principle in theoretical neuroscience is formally isomorphic to thermodynamic free energy minimisation in statistical mechanics: the KL divergence between approximate and true posterior plays the role of entropy, and active inference (action minimises surprise) is the biological analogue of thermodynamic relaxation toward equilibrium.

Fields: Theoretical Neuroscience, Cognitive Science, Statistical Physics, Thermodynamics, Information Theory

The thermodynamic free energy in statistical mechanics is F = U - TS, where U is internal energy, T is temperature, and S is entropy. A system at equilibrium minimises F, which is equivalent to maximi...

Bridge Sensory perception bridges neuroscience and physics through Weber-Fechner psychophysics: the nervous system compresses physical stimulus intensity logarithmically (Fechner) or as a power law (Stevens), with the neural implementation explained by efficient coding theory — sensory neurons maximize mutual information between stimuli and responses given metabolic constraints, naturally producing logarithmic compression.

Fields: Neuroscience, Psychophysics, Physics, Information Theory, Sensory Biology, Cognitive Science

Weber's law (1834): the just noticeable difference ΔS for a stimulus of intensity S is proportional to S: ΔS/S = k (Weber fraction, constant per modality). For brightness, k ≈ 0.02; for weight, k ≈ 0....

Bridge Brain-computer interfaces achieve maximum information transfer rate when neural population activity is decoded using optimal Bayesian filters, connecting neuroscience spike train statistics to the signal processing framework of Kalman filtering and Fisher information bounds.

Fields: Neuroscience, Signal Processing, Information Theory

The problem of decoding motor intent from neural population activity is an optimal state estimation problem: spike trains from N neurons encode a low-dimensional movement state x(t) with Fisher inform...

Bridge The best scientific theory is the shortest program that computes the observed data — Kolmogorov complexity K(x) formalises Occam's razor as data compression, making scientific explanation equivalent to finding the minimum description length (MDL) model, and overfitting identical to using a description that is longer than necessary.

Fields: Philosophy Of Science, Information Theory, Mathematics, Statistics, Machine Learning

Kolmogorov (1965) defined the complexity K(x) of a string x as the length (in bits) of the shortest program on a universal Turing machine U that outputs x and halts. Solomonoff (1964) independently de...

Bridge Statistical physics phase transitions ↔ sudden generalization (grokking), double descent, and loss landscape geometry in deep learning

Fields: Statistical Physics, Machine Learning, Information Theory

Deep neural networks undergo a series of phenomena that are strikingly described by the language of statistical physics phase transitions: 1. **Grokking (Power et al. 2022)**: a model trains to 100% t...

Bridge Integrated information theory (Tononi 2004) quantifies consciousness as Φ — the information generated by a system above and beyond its parts — while Friston's free energy principle connects conscious inference to entropy minimization, together posing the deepest open question about the relationship between physical entropy and phenomenal experience.

Fields: Physics, Thermodynamics, Information Theory, Cognitive Science, Consciousness Studies, Neuroscience

Integrated information theory (IIT; Tononi 2004) defines consciousness as Φ, the amount of irreducible integrated information: the effective information generated by the whole system above and beyond ...

Bridge Renormalization x Data Compression - irrelevant operators as redundant bits

Fields: Physics, Computer Science, Information Theory

Lossy data compression (JPEG, MP3, rate-distortion theory) and the renormalization group (integrating out short-scale fluctuations) both perform optimal coarse-graining: both discard information that...

Bridge Thermodynamics x Information Theory — entropy as the universal currency

Fields: Physics, Computer Science, Information Theory

Boltzmann's thermodynamic entropy S = k_B ln Omega and Shannon's information entropy H = -sum p_i log p_i are the same mathematical object; physical heat dissipation and information erasure are two fa...

Bridge Jaynes's maximum-entropy (MaxEnt) principle from statistical mechanics — applied with macroecological state variables as constraints — predicts species abundance distributions, species-area relationships, and metabolic scaling in ecological communities with no free parameters, demonstrating that biodiversity patterns emerge from information-theoretic constraints rather than species-specific biology.

Fields: Statistical Mechanics, Macroecology, Information Theory, Biodiversity Science

Jaynes (1957) showed that the Boltzmann-Gibbs distribution is the unique probability distribution that maximizes Shannon entropy subject to known macroscopic constraints (e.g. fixed mean energy). Hart...

Bridge Rational Inattention x Shannon Entropy - cognitive bandwidth as information cost

Fields: Economics, Computer Science, Information Theory

Sims' rational inattention model formalizes attention as a scarce cognitive resource with Shannon mutual information as the cost; optimal attention allocation under entropy cost produces price stickin...

Bridge Phase-preserving amplifiers add quantum noise bounded by Heisenberg uncertainty — when expressed as excess over classical Johnson noise at the input, this yields a fundamental noise figure floor near 3 dB at high gain for conventional quadrature devices (quantum optics ↔ microwave engineering).

Fields: Quantum Physics, Microwave Engineering, Electrical Engineering, Information Theory

Caves derived that a linear phase-preserving amplifier with large gain must introduce noise equivalent to at least half a quantum at the input port when referenced against the signal quadrature, trans...

Bridge Thermodynamics of Computing and Energy Limits — Landauer's principle, reversible logic, neuromorphic architectures, and the brain's energy efficiency define fundamental and practical computing bounds

Fields: Physics, Computer Engineering, Thermodynamics, Neuromorphic Computing, Information Theory

Landauer's principle (1961) establishes that logically irreversible operations — those that erase information — must dissipate at least k_BT ln 2 ≈ 3×10⁻²¹ J per bit at room temperature into the envir...
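
A one-screen worked comparison with the Landauer limit; the 1 fJ per-operation figure is an order-of-magnitude placeholder, not a measurement of any particular chip:

```python
from math import log

k_B = 1.380649e-23            # J/K (exact SI value)
T = 300.0                     # room temperature, K
landauer = k_B * T * log(2)
print(f"Landauer bound: {landauer:.2e} J per erased bit")      # ≈ 2.9e-21 J

assumed_energy_per_op = 1e-15  # J, placeholder for a contemporary logic operation
print(f"placeholder gap to the Landauer limit: {assumed_energy_per_op / landauer:.1e}x")
```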

Bridge Thermodynamic entropy increase, Landauer's information-erasure bound, and the cosmological arrow of time are three faces of the same asymmetry — a unified account requires identifying which low-entropy boundary condition (past hypothesis, Penrose's Weyl curvature, quantum decoherence) breaks time-reversal invariance at each scale.

Fields: Thermodynamics, Information Theory, Cosmology, Statistical Mechanics

Three apparently separate arrows of time — thermodynamic (entropy increases), computational (Landauer: erasing one bit dissipates at least k_B T ln 2 of heat), and cosmological (the universe began in ...

Bridge Landauer's principle ↔ thermodynamic cost of information erasure (Maxwell's demon resolution)

Fields: Thermodynamics, Information Theory, Statistical Physics, Computer Science

Landauer (1961) proved that erasing one bit of information in a thermal environment at temperature T requires dissipating at least k_B * T * ln(2) of free energy as heat — approximately 3 zJ at room t...

Bridge Renyi entropy x Multifractal spectra - generalized entropy as scaling exponent

Fields: Mathematics, Physics, Information Theory, Dynamical Systems

The Renyi entropy of order q, H_q = (1/(1-q)) log sum_i p_i^q, generates the full multifractal spectrum f(alpha) via Legendre transform tau(q) -> f(alpha); turbulent velocity fields, strange attractor...
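
A minimal sketch of H_q across several orders q for an assumed distribution, showing the non-increasing ordering in q and the q → 1 Shannon limit; the Legendre transform to f(alpha) is not included:

```python
import numpy as np

def renyi_entropy(p, q: float) -> float:
    """H_q = (1/(1-q)) * log2(sum_i p_i^q); the q -> 1 limit is Shannon entropy."""
    p = np.asarray(p, dtype=float)
    p = p[p > 0] / p.sum()
    if np.isclose(q, 1.0):
        return float(-np.sum(p * np.log2(p)))
    return float(np.log2(np.sum(p ** q)) / (1.0 - q))

p = [0.5, 0.25, 0.125, 0.125]                 # assumed distribution
for q in (0, 0.5, 1, 2, np.inf):
    hq = -np.log2(max(p)) if q == np.inf else renyi_entropy(p, q)
    print(f"H_{q} = {hq:.3f} bits")            # H_0 = 2, H_1 = 1.75, ..., H_inf = 1
```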

Bridge Bekenstein-Hawking entropy S_BH = A/4l_P² (area law) and the holographic bound connect black hole thermodynamics to information theory; the Page curve and island formula (replica wormholes) resolve Hawking's information paradox by showing entanglement entropy of radiation follows a unitary Page curve via quantum extremal surfaces.

Fields: Physics, Mathematics, Information Theory, Quantum Gravity, Thermodynamics

Bekenstein (1973) proposed that a black hole of horizon area A carries entropy S_BH = kA/4l_P² (in natural units, S_BH = A/4G in Planck units). This is the maximum entropy that can be enclosed in a re...

Bridge Statistical Mechanics and Information Theory — Boltzmann entropy and Shannon entropy are formally identical; Jaynes maximum entropy derives equilibrium, Landauer links erasure to thermodynamics

Fields: Physics, Mathematics, Information Theory, Thermodynamics, Statistical Mechanics

The Boltzmann entropy S = k_B ln W and Shannon entropy H = −Σpᵢ log pᵢ are mathematically identical after substituting k_B and adjusting the logarithm base. Boltzmann counts microstates W consistent w...

Bridge Brain-state transitions between avalanche-criticality and sub/super-critical regimes mirror second-order phase transitions in condensed-matter physics.

Fields: Neuroscience, Condensed Matter Physics, Statistical Mechanics, Information Theory

Neural avalanches (cascades of activity that follow a power-law size distribution) are the biological signature of a system operating near a second-order phase transition — the same mathematical struc...

Bridge Rumour and misinformation spreading on social networks maps exactly onto bond percolation on the contact network via the SIR epidemic model — with the percolation threshold p_c → 0 for scale-free networks, meaning any viral meme can reach the giant component of social attention regardless of initial conditions.

Fields: Physics, Social Science, Network Science, Epidemiology, Information Theory

SIR RUMOUR MODEL (Daley & Kendall 1965): Individuals are Susceptible (haven't heard), Infected (spreading), Recovered (heard but no longer spreading). Rate equations: dS/dt = -βSI; dI/dt = βSI - γ...
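
A minimal forward-Euler integration of these rate equations; beta, gamma, and the initial conditions are illustrative:

```python
beta, gamma = 0.3, 0.1          # illustrative spreading / stifling rates (R0 = beta/gamma = 3)
S, I, R = 0.999, 0.001, 0.0     # fractions of the population
dt, steps = 0.1, 2000

peak_I = 0.0
for _ in range(steps):
    dS = -beta * S * I
    dI = beta * S * I - gamma * I
    dR = gamma * I
    S, I, R = S + dt * dS, I + dt * dI, R + dt * dR
    peak_I = max(peak_I, I)

print(f"peak spreading fraction ≈ {peak_I:.3f}")
print(f"final fraction ever reached ≈ {R:.3f}")
# With R0 > 1 the rumour reaches a macroscopic fraction of the population;
# on scale-free contact networks the effective threshold can vanish entirely.
```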

Bridge Migratory birds navigate using quantum entanglement in cryptochrome — the radical-pair mechanism is a room-temperature quantum sensor inside a living protein, operating at the precision limit set by quantum Fisher information.

Fields: Quantum Mechanics, Molecular Biology, Sensory Neuroscience, Quantum Information Theory

The magnetic compass of migratory songbirds is not a classical ferromagnetic sensor (like a compass needle) but a quantum device: photo-excited electron transfers in the flavin-adenine dinucleotide (F...

Bridge Quantum key distribution achieves information-theoretic security (unconditional security independent of adversary computing power) by exploiting quantum measurement disturbance, bridging quantum computing and cryptography through the quantum no-cloning theorem and Shannon's one-time pad.

Fields: Quantum Computing, Cryptography, Information Theory

BB84 quantum key distribution achieves information-theoretic security (proven secure against computationally unbounded adversaries) because any eavesdropping measurement on quantum states introduces d...

Bridge The quantum fault-tolerance threshold theorem connects quantum error correction to information theory: if the physical error rate per gate p is below a threshold p_th (typically ~1% for surface codes), arbitrarily long quantum computations can be performed reliably by concatenating error-correcting codes, with overhead growing only polylogarithmically in computation length.

Fields: Quantum Computing, Quantum Information Theory, Computer Science

For a concatenated code of level k with physical error rate p and threshold p_th, the logical error rate scales as p_L = p_th·(p/p_th)^{2^k}. Each level of concatenation doubles the exponent, so after...
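
A short numerical check of the p_L = p_th·(p/p_th)^(2^k) scaling for an assumed threshold value, showing the doubly exponential suppression below p_th:

```python
p_th = 1e-2                                   # assumed threshold (order of magnitude for surface codes)
for p in (5e-3, 9e-3, 2e-2):                  # below, near, and above threshold
    row = [p_th * (p / p_th) ** (2 ** k) for k in range(5)]
    print(f"p = {p:.0e}: " + ", ".join(f"{x:.1e}" for x in row))
# Below threshold the logical error rate collapses doubly exponentially with
# concatenation level k; above threshold it grows instead.
```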

Bridge Quantum decoherence selects pointer states through einselection: the preferred basis that survives entanglement with the environment is determined by the system-environment interaction Hamiltonian, explaining the emergence of classical reality from quantum superpositions

Fields: Quantum Physics, Information Theory

Environment-induced superselection (einselection) identifies pointer states as eigenstates of the system observable that commutes with the system-environment interaction Hamiltonian H_int, explaining ...

Bridge Quantum error-correcting codes (stabilizer codes, surface codes) and the holographic principle in quantum gravity (AdS/CFT) are the same mathematical structure: bulk operators in AdS are encoded in boundary CFT degrees of freedom via a quantum error-correcting code, with the Ryu-Takayanagi formula (S = A/4G_N) expressing entanglement entropy as a quantum error-correction redundancy statement.

Fields: Quantum Information Theory, Quantum Gravity, String Theory, Quantum Error Correction, Condensed Matter Physics

Quantum error correction encodes k logical qubits in n physical qubits with distance d (denoted [[n,k,d]]), such that any error affecting fewer than d/2 qubits can be detected and corrected. The key p...

Bridge The Ryu-Takayanagi formula equates the entanglement entropy of a boundary CFT region to the area of the minimal bulk surface divided by 4G, connecting quantum gravity geometry to quantum information theory through holography

Fields: Physics, Information Theory, Quantum Physics

The holographic entanglement entropy formula S_A = Area(gamma_A) / (4*G_N*hbar) (Ryu-Takayanagi) states that entanglement entropy of boundary region A in a CFT equals the area of the minimal bulk surf...

Bridge Cultural transmission of memes across social networks obeys Shannon's noisy channel theorem — meme fidelity, cultural drift, and the homogenising effects of mass media are quantitatively described by channel capacity, noise models, and the source-channel coding theorem from information theory.

Fields: Social Science, Information Theory, Cultural Evolution, Sociology, Communication Theory

Shannon (1948) proved that any communication channel with noise can reliably transmit information at rates up to its channel capacity C = max_{p(x)} I(X;Y), and that error rates rise exponentially abo...

Bridge Differential privacy provides an information-theoretic guarantee — epsilon bounds the log-likelihood ratio an adversary can achieve distinguishing any individual's data — creating a mathematically precise privacy-utility tradeoff that is dual to Neyman-Pearson hypothesis testing, bridging social privacy norms to information theory and statistical decision theory.

Fields: Social Science, Information Theory, Statistics, Computer Science, Privacy Law

Differential privacy (Dwork et al. 2006): a mechanism M satisfies epsilon-DP if for any adjacent datasets D, D' differing by one record: P[M(D)∈S] ≤ exp(epsilon) × P[M(D')∈S]. This is a formal guarant...
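
A minimal sketch of the Laplace mechanism for a counting query with global sensitivity 1; the dataset and epsilon are illustrative, and the exp(epsilon) likelihood-ratio bound follows directly from the Laplace density:

```python
import numpy as np

rng = np.random.default_rng(1)

def laplace_count(data, predicate, epsilon: float) -> float:
    """epsilon-DP release of a counting query (global sensitivity 1): add Laplace(1/epsilon) noise."""
    true_count = sum(1 for x in data if predicate(x))
    return true_count + rng.laplace(loc=0.0, scale=1.0 / epsilon)

ages = [23, 35, 41, 29, 52, 67, 31]                  # toy dataset
noisy = laplace_count(ages, lambda a: a >= 40, epsilon=0.5)
print(f"noisy count of records with age >= 40: {noisy:.2f}")
# For adjacent datasets differing in one record, the output densities differ by a factor
# of at most exp(epsilon) -- exactly the log-likelihood-ratio bound stated above.
```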

Bridge Homophily and structural segregation — the tendency of similar individuals to connect produces modular networks that are the mathematical basis of filter bubbles and information siloing

Fields: Social Science, Network Science, Sociology, Mathematics, Information Theory

Homophily — the tendency of similar individuals to form ties ("birds of a feather flock together") — is the dominant structural force shaping social networks. Measured by the assortativity coefficient...

Bridge Boltzmann's entropy S = k_B ln W and Shannon's entropy H = −Σ p_i log p_i are formally identical — thermodynamic entropy IS the Shannon information entropy of the macroscopic probability distribution over microstates.

Fields: Statistical Mechanics, Information Theory, Thermodynamics

Boltzmann's entropy S = k_B ln W (W = number of equally probable microstates) and Shannon's entropy H = −Σ p_i log p_i (probability distribution over messages) are the same mathematical object up to t...

Bridge Fluctuation theorems (Crooks, Jarzynski) connect nonequilibrium work distributions to equilibrium free energy differences, bridging stochastic thermodynamics and information theory through the mathematical identity between entropy production and relative entropy (KL divergence).

Fields: Statistical Physics, Information Theory, Thermodynamics

The Crooks fluctuation theorem P_F(W)/P_R(-W) = exp((W - DeltaF)/kT) and the Jarzynski equality ⟨exp(-W/kT)⟩ = exp(-DeltaF/kT) establish that entropy production in nonequilibrium processes equal...

Bridge R.A. Fisher's fundamental theorem of natural selection and his Fisher information matrix in statistics are the same mathematical object — the rate of increase of mean fitness equals the population's statistical Fisher information about fitness, and this identity gives evolutionary biology the full toolkit of statistical estimation theory.

Fields: Statistics, Mathematical Statistics, Evolutionary Biology, Population Genetics, Quantum Information Theory

R.A. Fisher invented both: (a) the Fisher information matrix I(theta) in statistics (1925) — the expected curvature of the log-likelihood, whose inverse gives the Cramér-Rao lower bound on estimation ...

Bridge Maxwell's demon is resolved by Landauer's principle — erasing one bit of information dissipates at least kT ln 2 of energy, exactly linking Shannon information entropy to thermodynamic entropy and establishing the physical cost of logical irreversibility.

Fields: Thermodynamics, Computer Science, Information Theory, Statistical Mechanics

Maxwell's demon (1867): a hypothetical being that monitors individual molecules in a partitioned gas container, opening a small door to let fast molecules pass to one side and slow ones to the other. ...

Bridge RNA virus populations exist as quasispecies clouds near an error threshold defined by information theory: exceeding the critical mutation rate causes mutational meltdown, making the Eigen quasispecies equations a direct application of Shannon channel capacity to molecular evolution.

Fields: Virology, Information Theory, Evolutionary Biology

Eigen's quasispecies theory maps RNA virus evolution onto an information-theoretic error-correction problem: the master sequence is the optimal codeword, replication fidelity is the channel capacity, ...
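
A small worked form of the error threshold: the master sequence persists only while its copying fidelity (1 − mu)^L exceeds 1/sigma (sigma = its selective superiority), giving L_max ≈ ln(sigma)/mu for small mu. The values of mu and sigma below are illustrative:

```python
from math import log

def max_genome_length(mu: float, sigma: float) -> float:
    """Eigen error threshold: (1 - mu)^L > 1/sigma  =>  L_max ≈ ln(sigma)/mu for small mu."""
    return log(sigma) / mu

for mu in (1e-3, 1e-4, 1e-5, 1e-9):           # per-base mutation rates
    print(f"mu = {mu:.0e}: L_max ≈ {max_genome_length(mu, sigma=10):.1e} bases")
# RNA viruses (mu ~ 1e-4 to 1e-5) sit near their ~1e4-1e5 base threshold,
# while proofreading DNA polymerases (mu ~ 1e-9) allow far longer genomes.
```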

Open Unknowns (1)

Unknown How far below Shannon entropy-rate bounds can large language models push perplexity when corpora are demonstrably nonstationary across domains and eras? (u-entropy-rate-nonstationary-language-data)

Active Hypotheses

Hypothesis Intrinsic motivation is operationally identical to empowerment maximisation — the brain implements a policy that maximises the channel capacity from actions to future states I(A;S'), and autonomy need frustration produces measurable reductions in action-outcome mutual information detectable from both neural signals and behavioral entropy. (confidence: medium)
Hypothesis Bayesian model comparison via marginal likelihood P(E|M) = ∫ P(E|θ,M)P(θ|M)dθ automatically implements Occam's razor — the model evidence penalizes complexity proportional to the prior volume of unused parameter space — and this automatic penalization is formally equivalent to the minimum description length (MDL) principle and Fisher information geometry. (confidence: medium)
Hypothesis The information transfer rate of state-of-the-art intracortical BCIs is within a factor of 3 of the Fisher information bound set by the recorded neural population, and the primary limitation is non-stationarity rather than suboptimal decoding, predicting that adaptive decoders that track neural tuning drift will outperform fixed decoders by 2-3x in chronic implant conditions. (confidence: high)
Hypothesis Aesthetic preference ratings for visual and auditory stimuli follow an inverted-U function of lossless compression ratio (a computable approximation of Kolmogorov complexity K), with peak preference at intermediate compression ratios of 2–5x — the "sweet spot" — and this relationship is cross-culturally universal, replicating across at least 6 cultural groups with distinct aesthetic traditions. (confidence: medium)
Hypothesis A publicly accessible cross-domain bridge catalog measurably reduces the average time between independent parallel discoveries in different fields (the "Merton multiple" lag), detectable through citation network analysis comparing pre- and post-catalog publication patterns. (confidence: high)
Hypothesis The maximum sustainable rate of mean fitness increase in a population is bounded above by the Shannon channel capacity C = B log2(1 + S/N), where B is the effective number of independently evolving loci and S/N is the fitness variance-to-noise ratio, and this bound is approached within 2x in long-term evolution experiments. (confidence: medium)
Hypothesis Exact sparse signal recovery from m harmonic measurements of an s-sparse signal requires m ≥ C·s·log(n/s) measurements — and this bound is sharp up to constants — with the restricted isometry property (RIP) of random Fourier matrices achievable with high probability for m ≥ s·polylog(n). (confidence: medium)
Hypothesis Off-target base editing rates follow a position-dependent mismatch model with exponential rate reduction per mismatch position (weighted by distance from PAM), matching the structure of a convolutional code error probability function and enabling quantitative prediction of off-target rates from guide sequence alone. (confidence: medium)
Hypothesis Cross-validated damping schedules selected on synthetic loopy linkage graphs reduce switch-error rates versus fixed defaults when marker maps induce long-range dependencies. (confidence: high)
Hypothesis Differential privacy (epsilon, delta) is dual to hypothesis testing: epsilon controls the Type I + Type II error tradeoff for any test distinguishing adjacent datasets, and the hockey-stick divergence E_alpha = max_S (P(M(D)∈S) - alpha × P(M(D')∈S)) provides the tight characterization of (epsilon, delta)-DP in terms of Neyman-Pearson optimal hypothesis testing theory. (confidence: high)


Generated 2026-05-10 · USDR Dashboard