Fields: Machine Learning, Statistical Physics, Information Theory, Neuroscience
Grokking — the phenomenon where a neural network suddenly transitions from memorisation to generalisation after a long plateau — exhibits sharp, non-analytic changes in the effective dimensionality of...
Fields: Aesthetics, Cognitive Science, Information Theory, Mathematics, Music Cognition, Visual Neuroscience
Birkhoff (1933) defined aesthetic measure as M = O/C — order divided by complexity. High order with low complexity (a single constant tone, a uniform colour field) has M → ∞ but is perceived as boring...
Fields: Astronomy, Quantum Gravity, Information Theory, Quantum Error Correction
Hawking's 1974 calculation showed that black holes radiate thermally, apparently destroying the quantum information contained in infalling matter. This is the information paradox: unitary quantum mech...
Fields: Astrophysics, Information Theory, Quantum Gravity, Theoretical Physics
The discovery that black holes have entropy proportional to their surface area — not volume — is the most profound known connection between spacetime geometry and information theory. 1. Bekenstein-Haw...
Fields: Biology, Computer Science, Information Theory
Adenine base editors (ABEs) convert A-T to G-C base pairs without double-strand breaks, implementing a precise one-bit correction in the genomic information channel; the specificity window (protospace...
Fields: Biology, Computer Science, Information Theory
Gene regulatory networks face a fundamental channel capacity limit: the maximum mutual information between transcription factor concentration (input) and target gene expression (output) is bounded by ...
Fields: Biology, Computer Science, Information Theory, Evolutionary Biology
Natural selection updates the population's genetic prior toward higher fitness using the same mathematical operation as Bayesian belief updating; Fisher's fundamental theorem of natural selection is t...
Fields: Neuroscience, Computer Science, Information Theory
Retinal ganglion cell spike trains are efficient codes in the information-theoretic sense; center-surround receptive fields implement a whitening filter that removes spatial redundancy in natural imag...
Fields: Molecular Biology, Information Theory, Coding Theory, Evolutionary Biology, Genetics
Shannon's channel coding theorem (1948) establishes that for any noisy channel of capacity C (equal to B log₂(1 + SNR) for the band-limited Gaussian channel), there exist codes that transmit information with arbitrarily small error probability a...
Fields: Molecular Biology, Information Theory, Computational Biology
The genetic code has 64 codons encoding 20 amino acids plus stop signals, giving ~1.5 bits of coding redundancy per codon. Synonymous codons (different codons for the same amino acid) are used non-uni...
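A back-of-the-envelope check of the roughly 1.5 bits of redundancy quoted above, a minimal sketch using only the counts given in the entry and assuming equiprobable amino acids:

```python
import math

# 64 codons could in principle distinguish log2(64) = 6 bits per codon.
n_codons = 64
# The standard code maps them onto 20 amino acids plus 1 stop signal = 21 symbols.
n_symbols = 21

max_bits = math.log2(n_codons)        # 6.0 bits of codon capacity
used_bits = math.log2(n_symbols)      # ~4.39 bits if all symbols were equiprobable
redundancy = max_bits - used_bits     # ~1.6 bits per codon

print(f"codon capacity : {max_bits:.2f} bits")
print(f"amino-acid info: {used_bits:.2f} bits (equiprobable assumption)")
print(f"redundancy     : {redundancy:.2f} bits per codon")
```

With real, non-uniform amino-acid usage the second number drops slightly, which is consistent with the ~1.5 bits figure in the entry.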
Fields: Biology, Information Theory, Collective Behavior
Quorum sensing in bacteria: the threshold concentration S_q where gene expression switches satisfies ∂F/∂S = 0 (Hill function bistability), giving a sharp collective switch at population density N > N...
Fields: Biology, Information Theory, Genomics
High-throughput pooled CRISPR experiments assign binary-like signatures to perturbations so downstream sequencing demultiplexes signals — coding theory supplies intuition about Hamming distance and re...
Fields: Biology, Information Theory, Computer Science
Kauffman (1969) modeled gene regulatory networks as Boolean networks: N genes each updated by a Boolean function of K randomly chosen inputs. For K < 2, networks freeze in ordered attractors; for K > ...
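A minimal random Boolean network in the spirit of the Kauffman model described above; a sketch where N, K, and the attractor-detection horizon are illustrative choices, not values from the entry:

```python
import random

def random_boolean_network(N, K, seed=0):
    """Each gene gets K random inputs and a random Boolean update table."""
    rng = random.Random(seed)
    inputs = [rng.sample(range(N), K) for _ in range(N)]
    tables = [[rng.randint(0, 1) for _ in range(2 ** K)] for _ in range(N)]
    return inputs, tables

def step(state, inputs, tables):
    new = []
    for i in range(len(state)):
        idx = 0
        for bit in (state[j] for j in inputs[i]):
            idx = (idx << 1) | bit
        new.append(tables[i][idx])
    return tuple(new)

def attractor_length(N, K, steps=5000, seed=0):
    """Iterate synchronously until a state repeats; return the cycle length."""
    inputs, tables = random_boolean_network(N, K, seed)
    rng = random.Random(seed + 1)
    state = tuple(rng.randint(0, 1) for _ in range(N))
    seen = {}
    for t in range(steps):
        if state in seen:
            return t - seen[state]
        seen[state] = t
        state = step(state, inputs, tables)
    return None

for K in (1, 2, 4):
    lengths = [attractor_length(N=12, K=K, seed=s) for s in range(5)]
    print(f"K={K}: attractor lengths {lengths}")
```

At this tiny N the effect is only suggestive, but larger K already tends to give longer, more irregular attractors than K = 1.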
Fields: Molecular Biology, Information Theory
Schneider & Stephens (1990) showed that transcription factor binding sites can be quantified as information in bits: the information content Ri = 2 − H(position), where H is Shannon entropy over the f...
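A small sketch of the per-position calculation described above, using made-up base frequencies for single alignment columns (the small-sample correction used in practice is omitted):

```python
import math

def column_information(freqs):
    """R_i = 2 - H(position) in bits, for base frequencies at one site."""
    H = -sum(p * math.log2(p) for p in freqs if p > 0)
    return 2.0 - H

# Hypothetical frequencies of A, C, G, T at three binding-site positions.
sites = {
    "perfectly conserved": [1.0, 0.0, 0.0, 0.0],       # H = 0 -> 2 bits
    "two bases tolerated": [0.5, 0.5, 0.0, 0.0],       # H = 1 -> 1 bit
    "no preference":       [0.25, 0.25, 0.25, 0.25],   # H = 2 -> 0 bits
}
for name, freqs in sites.items():
    print(f"{name:22s}: {column_information(freqs):.2f} bits")
```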
Fields: Biophysics, Information Theory, Systems Biology, Nonlinear Dynamics
In excitable and threshold-like cellular pathways, moderate noise can increase detectability of weak periodic inputs by synchronizing barrier crossings with subthreshold stimuli. This maps directly to...
Fields: Neuroscience, Cognitive Science, Information Theory, Sensory Physiology, Computational Neuroscience
Barlow (1961) proposed that the goal of sensory processing is to represent the environment using the minimum number of active neurons — equivalently, to maximize the Shannon mutual information I(stimu...
Fields: Computer Science, Mathematics, Statistical Physics, Combinatorics, Information Theory
Many NP-complete problems (3-SAT, graph coloring, random k-SAT, traveling salesman) exhibit sharp phase transitions in their typical-case hardness as a control parameter varies. In random k-SAT: let α...
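A toy experiment in the spirit of the random 3-SAT transition described above; a sketch in which n is kept tiny so brute-force checking is feasible, so the crossover is only roughly near the large-n threshold of about 4.27:

```python
import itertools, random

def random_3sat(n_vars, n_clauses, rng):
    """Random 3-CNF formula: each clause has 3 distinct variables with random signs."""
    clauses = []
    for _ in range(n_clauses):
        vars_ = rng.sample(range(n_vars), 3)
        clauses.append([(v, rng.choice((True, False))) for v in vars_])
    return clauses

def satisfiable(n_vars, clauses):
    """Brute-force check over all 2^n assignments (only viable for tiny n)."""
    for assignment in itertools.product((False, True), repeat=n_vars):
        if all(any(assignment[v] == sign for v, sign in clause) for clause in clauses):
            return True
    return False

rng = random.Random(0)
n = 12
for alpha in (2.0, 3.0, 4.0, 4.5, 5.0, 6.0):
    m = int(alpha * n)
    sat = sum(satisfiable(n, random_3sat(n, m, rng)) for _ in range(20))
    print(f"alpha = {alpha:.1f}: {sat}/20 instances satisfiable")
```

The fraction of satisfiable instances drops sharply as the clause-to-variable ratio alpha crosses the transition region.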
Fields: Machine Learning, Statistical Physics, Computer Science, Information Theory
Energy-based models assign low energy to plausible configurations; training shapes the energy landscape so that data lie in wells. Contrastive objectives such as InfoNCE reweight logits of positive ve...
Fields: Computer Science, Theoretical Machine Learning, Statistics, Statistical Physics, Information Theory
PAC (Probably Approximately Correct) learning theory (Valiant 1984) provides a mathematical framework for when a learning algorithm can generalise from training data to unseen examples. A concept clas...
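A worked instance of the standard finite-hypothesis-class, realizable-case PAC bound m ≥ (1/ε)(ln|H| + ln(1/δ)) implied by the framework described above; the hypothesis class and the (ε, δ) values plugged in are illustrative:

```python
import math

def pac_sample_complexity(hypothesis_count, epsilon, delta):
    """Samples sufficient for error <= epsilon with prob >= 1 - delta (finite H, realizable case)."""
    return math.ceil((math.log(hypothesis_count) + math.log(1.0 / delta)) / epsilon)

# Example class: conjunctions over 20 Boolean variables (each variable appears
# positively, negatively, or not at all), giving |H| = 3^20 hypotheses.
H = 3 ** 20
for eps, delta in [(0.1, 0.05), (0.05, 0.05), (0.01, 0.01)]:
    m = pac_sample_complexity(H, eps, delta)
    print(f"epsilon = {eps}, delta = {delta}: m >= {m} samples suffice")
```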
Fields: Biology, Computer Science, Information Theory, Molecular Biology
DNA replication achieves an error rate of approximately 10^-9 per base through a three-stage error-correction pipeline (polymerase insertion selectivity 10^-5, 3'→5' exonuclease proofreading 10^-2, p...
Fields: Ecology, Biodiversity Science, Information Theory, Statistical Mechanics, Biogeography
Shannon's entropy H = -Σ_i p_i log p_i applied to species i with relative abundance p_i is used directly as a biodiversity index (H' or Shannon diversity), quantifying uncertainty in the species ident...
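A minimal computation of the Shannon diversity index defined above for a hypothetical community; the abundance vectors are made up for illustration:

```python
import math

def shannon_diversity(abundances, base=math.e):
    """H' = -sum p_i log p_i over relative species abundances."""
    total = sum(abundances)
    ps = [a / total for a in abundances if a > 0]
    return -sum(p * math.log(p, base) for p in ps)

even_community   = [25, 25, 25, 25]   # four equally common species
skewed_community = [85, 10, 3, 2]     # one dominant species

print(f"even   H' = {shannon_diversity(even_community):.3f} nats")
print(f"skewed H' = {shannon_diversity(skewed_community):.3f} nats")
# Effective number of species (Hill number of order 1) = exp(H').
print(f"effective species, skewed: {math.exp(shannon_diversity(skewed_community)):.2f}")
```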
Fields: Ecology, Statistics, Information Theory, Conservation Biology, Bayesian Inference
Jaynes (1957) formulated the maximum entropy (MaxEnt) principle for statistical inference: among all probability distributions consistent with known constraints (expected values of observable features...
Fields: Economics, Information Theory, Probability Theory, Finance, Stochastic Processes
Fama (1970) defined the Efficient Market Hypothesis (EMH): asset prices fully reflect all available information. Samuelson (1965) showed that this is mathematically equivalent to the statement that pr...
Fields: Economics, Statistical Physics, Econophysics, Information Theory
Dragulescu & Yakovenko (2000) demonstrated that if economic agents exchange wealth in random pairwise interactions conserving total wealth (analogous to elastic collisions conserving energy), the stat...
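A small simulation of the random pairwise exchange process described above; a sketch in which the exchange rule (splitting the pooled wealth of two agents uniformly at random) is one of the wealth-conserving rules considered in that literature, and the parameters are illustrative:

```python
import random

def simulate_exchange(n_agents=1000, steps=200_000, seed=0):
    rng = random.Random(seed)
    wealth = [1.0] * n_agents            # everyone starts equal; total wealth is conserved
    for _ in range(steps):
        i, j = rng.randrange(n_agents), rng.randrange(n_agents)
        if i == j:
            continue
        pool = wealth[i] + wealth[j]
        share = rng.random()             # random repartition of the pooled wealth
        wealth[i], wealth[j] = share * pool, (1 - share) * pool
    return wealth

wealth = simulate_exchange()
mean = sum(wealth) / len(wealth)
# For a Boltzmann-Gibbs (exponential) distribution, P(w > mean) = exp(-1) ~ 0.368.
frac_above_mean = sum(w > mean for w in wealth) / len(wealth)
print(f"mean wealth         : {mean:.3f}")
print(f"fraction above mean : {frac_above_mean:.3f} (exponential predicts ~0.368)")
```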
Fields: Electromagnetism, Information Theory, Communications Engineering
Maxwell's equations in free space admit plane wave solutions of the form E = E₀ exp(i(k·r − ωt)), which are identical in mathematical structure to the carrier waves used in all radio, microwave, and o...
Fields: Engineering, Mathematics, Information Theory, Computer Science
Shannon's source coding theorem (1948) proves that a source with entropy H bits/symbol can be losslessly compressed to H bits/symbol on average but not below — setting an absolute mathematical lower ...
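A sketch comparing the entropy lower bound above with the average length achieved by a Huffman code on a made-up source; Huffman coding is optimal among symbol codes, so it lands within one bit of H (and exactly at H for the dyadic probabilities chosen here):

```python
import heapq, math

def entropy(probs):
    return -sum(p * math.log2(p) for p in probs.values() if p > 0)

def huffman_lengths(probs):
    """Return codeword length per symbol for a binary Huffman code."""
    heap = [(p, i, {s: 0}) for i, (s, p) in enumerate(probs.items())]
    heapq.heapify(heap)
    counter = len(heap)
    while len(heap) > 1:
        p1, _, c1 = heapq.heappop(heap)
        p2, _, c2 = heapq.heappop(heap)
        merged = {s: depth + 1 for s, depth in {**c1, **c2}.items()}
        heapq.heappush(heap, (p1 + p2, counter, merged))
        counter += 1
    return heap[0][2]

source = {"a": 0.5, "b": 0.25, "c": 0.125, "d": 0.125}
lengths = huffman_lengths(source)
avg_len = sum(source[s] * lengths[s] for s in source)
print(f"entropy H       = {entropy(source):.3f} bits/symbol")
print(f"Huffman avg len = {avg_len:.3f} bits/symbol (H <= avg < H + 1)")
```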
Fields: Immunology, Physics, Information Theory, Statistical Mechanics, Mathematics
The adaptive immune system must recognize ~10¹⁵ possible foreign antigens using only ~10⁷ circulating T-cell clones (each with a distinct T-cell receptor, TCR). This is a covering problem: the T-cell ...
Fields: Information Theory, Molecular Evolution, Statistical Physics, Virology
Manfred Eigen's quasispecies theory (1971) shows that a replicating population of sequences (RNA, DNA, or proteins) undergoes a phase transition at a critical mutation rate mu_c: below mu_c, a "master...
Fields: Information Theory, Epistemology, Network Science, Cognitive Science, Library Science, Science Of Science
Shannon's channel capacity theorem (C = B log₂(1 + S/N)) provides a formal framework for the scientific knowledge overload problem. Consider each scientific domain as a transmitter and each researcher...
Fields: Information Theory, Genetics, Computer Science
Established engineering practice uses sum-product / approximate message passing algorithms on graphical models for large-scale genotype phasing and related inference tasks; residual speculative analog...
Fields: Information Theory, Molecular Biology, Genetics, Evolutionary Biology
Shannon's (1948) framework maps onto molecular genetics with striking precision. The DNA alphabet has size q = 4 (A, T, G, C), so the maximum entropy per position is log₂(4) = 2 bits. The information ...
Fields: Information Theory, Computational Linguistics, Machine Learning
Shannon–McMillan–Breiman asymptotic equipartition implies typical sequences carry ~nh bits per n symbols for ergodic processes with entropy rate h. Neural language models minimize average negative log...
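A tiny illustration of the link stated above between average negative log-likelihood, bits per symbol, and perplexity; the per-token probabilities are made up:

```python
import math

# Hypothetical probabilities a language model assigns to each token of a held-out sequence.
token_probs = [0.20, 0.05, 0.40, 0.10, 0.25, 0.30]

nll_nats = -sum(math.log(p) for p in token_probs) / len(token_probs)
bits_per_token = nll_nats / math.log(2)   # cross-entropy in bits/symbol
perplexity = math.exp(nll_nats)           # = 2 ** bits_per_token

print(f"avg NLL       : {nll_nats:.3f} nats/token")
print(f"cross-entropy : {bits_per_token:.3f} bits/token")
print(f"perplexity    : {perplexity:.3f}")
```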
Fields: Linguistics, Information Theory, Cognitive Science, Statistical Physics, Complexity Science
Zipf (1949) observed that the frequency of a word is inversely proportional to its rank in the frequency table: f(r) ∝ 1/r. This power law appears in word frequencies across all natural languages, cit...
Fields: Mathematics, Physics, Dynamical Systems, Information Theory
Deterministic chaos (positive Lyapunov exponents, sensitive dependence on initial conditions) is the physical manifestation of ergodic mixing in measure-preserving dynamical systems; the Kolmogorov-Si...
Fields: Mathematics, Evolutionary Biology, Information Theory, Statistics
The space of probability distributions over a discrete variable forms a Riemannian manifold equipped with the Fisher information metric g_{ij} = E[∂_i log p · ∂_j log p], where i,j index parameters of...
Fields: Linguistics, Information Theory, Mathematics, Statistical Physics, Cognitive Science
Zipf (1935, 1949) documented that in any natural language corpus the r-th most frequent word has frequency f_r ≈ C / r (Zipf's law, exponent α = 1 exactly). He proposed a "principle of least effort": ...
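A quick check of the rank-frequency relation f_r ≈ C/r stated above; the word counts here are fabricated purely to show how the exponent is estimated by a log-log least-squares fit:

```python
import math

# Hypothetical word counts, already sorted into rank order.
counts = [1200, 615, 402, 310, 243, 201, 175, 148, 133, 120]
ranks = range(1, len(counts) + 1)

# Fit log f = log C - alpha * log r by ordinary least squares.
xs = [math.log(r) for r in ranks]
ys = [math.log(c) for c in counts]
n = len(xs)
mx, my = sum(xs) / n, sum(ys) / n
slope = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / sum((x - mx) ** 2 for x in xs)
alpha = -slope
print(f"fitted Zipf exponent alpha ~ {alpha:.2f} (Zipf's law predicts 1)")
```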
Fields: Mathematics, Neuroscience, Cognitive Science, Statistics, Information Theory
The predictive coding framework (Rao & Ballard 1999) proposes that cortical processing is bidirectional: top-down connections carry predictions x̂_L = f(x_{L+1}) from higher to lower levels, while bot...
Fields: Neuroscience, Engineering, Neural Engineering, Information Theory, Signal Processing
BCIs decode intended movement from neural population activity recorded by electrode arrays. Linear decoding: ŷ = Wx + b where x ∈ R^N is the spike rate vector from N neurons, y is decoded kinematics (...
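A minimal ridge-regression fit of the linear decoder ŷ = Wx + b mentioned above, on synthetic data; the number of neurons, the number of samples, and the noise level are arbitrary choices for the sketch:

```python
import numpy as np

rng = np.random.default_rng(0)
N, T = 50, 500                                   # neurons, time bins
W_true = rng.normal(size=(2, N))                 # true mapping to 2-D kinematics
x = rng.poisson(5, size=(T, N)).astype(float)    # spike counts per bin
y = x @ W_true.T + rng.normal(scale=0.5, size=(T, 2))

# Ridge regression with a bias column: W_hat = (X^T X + lambda I)^{-1} X^T Y.
X = np.hstack([x, np.ones((T, 1))])
lam = 1.0
W_hat = np.linalg.solve(X.T @ X + lam * np.eye(N + 1), X.T @ y).T

y_hat = X @ W_hat.T
r2 = 1 - np.sum((y - y_hat) ** 2) / np.sum((y - y.mean(0)) ** 2)
print(f"decoding R^2 on training data: {r2:.3f}")
```

A real decoder would be evaluated on held-out data; this only shows the shape of the estimation problem.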
Fields: Neuroscience, Information Theory, Sensory Physiology, Computational Neuroscience
The nervous system encodes stimuli as spike trains — discrete all-or-none action potentials — which can be analysed as Shannon communication channels. The channel capacity C = B log₂(1 + S/N) bounds t...
Fields: Neuroscience, Information Theory, Cognitive Science, Psychology
Ryan and Deci (2000, ~27k citations) established that competence, autonomy, and relatedness are fundamental psychological needs whose satisfaction supports intrinsic motivation and predicts well-being. Information theory and ...
Fields: Neuroscience, Mathematics, Information Theory
IIT (Tononi 2004, 2014) defines Φ as the minimum information generated by a system as a whole beyond its minimum information partition (MIP). Mathematically, Φ is a measure over a causal structure (di...
Fields: Theoretical Neuroscience, Cognitive Science, Statistical Physics, Thermodynamics, Information Theory
The thermodynamic free energy in statistical mechanics is F = U - TS, where U is internal energy, T is temperature, and S is entropy. A system at equilibrium minimises F, which is equivalent to maximi...
Fields: Neuroscience, Psychophysics, Physics, Information Theory, Sensory Biology, Cognitive Science
Weber's law (1834): the just noticeable difference ΔS for a stimulus of intensity S is proportional to S: ΔS/S = k (Weber fraction, constant per modality). For brightness, k ≈ 0.02; for weight, k ≈ 0....
Fields: Neuroscience, Signal Processing, Information Theory
The problem of decoding motor intent from neural population activity is an optimal state estimation problem: spike trains from N neurons encode a low-dimensional movement state x(t) with Fisher inform...
Fields: Philosophy Of Science, Information Theory, Mathematics, Statistics, Machine Learning
Kolmogorov (1965) defined the complexity K(x) of a string x as the length (in bits) of the shortest program on a universal Turing machine U that outputs x and halts. Solomonoff (1964) independently de...
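K(x) is uncomputable, but compressed length gives a crude computable upper-bound proxy, which is enough to illustrate the intuition above; zlib is just one off-the-shelf compressor, and only the comparison between the two strings is meaningful:

```python
import random, zlib

def compressed_length(s: str) -> int:
    """Bytes of zlib-compressed s: a rough upper bound on K(s)/8 plus constant overhead."""
    return len(zlib.compress(s.encode(), level=9))

regular = "ab" * 500                                        # highly structured: short description
random.seed(0)
noisy = "".join(random.choice("ab") for _ in range(1000))   # ~1 bit per character, little structure

print(f"regular: len = {len(regular)}, compressed = {compressed_length(regular)} bytes")
print(f"noisy  : len = {len(noisy)}, compressed = {compressed_length(noisy)} bytes")
```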
Fields: Statistical Physics, Machine Learning, Information Theory
Deep neural networks undergo a series of phenomena that are strikingly described by the language of statistical physics phase transitions: 1. **Grokking (Power et al. 2022)**: a model trains to 100% t...
Fields: Physics, Thermodynamics, Information Theory, Cognitive Science, Consciousness Studies, Neuroscience
Integrated information theory (IIT; Tononi 2004) defines consciousness as Φ, the amount of irreducible integrated information: the effective information generated by the whole system above and beyond ...
Fields: Physics, Computer Science, Information Theory
Lossy data compression (JPEG, MP3, rate-distortion theory) and the renormalization group (integrating out short-scale fluctuations) both perform optimal coarse-graining: both discard information that...
Fields: Physics, Computer Science, Information Theory
Boltzmann's thermodynamic entropy S = k_B ln Omega and Shannon's information entropy H = -sum p_i log p_i are the same mathematical object; physical heat dissipation and information erasure are two fa...
Fields: Statistical Mechanics, Macroecology, Information Theory, Biodiversity Science
Jaynes (1957) showed that the Boltzmann-Gibbs distribution is the unique probability distribution that maximizes Shannon entropy subject to known macroscopic constraints (e.g. fixed mean energy). Hart...
Fields: Economics, Computer Science, Information Theory
Sims' rational inattention model formalizes attention as a scarce cognitive resource with Shannon mutual information as the cost; optimal attention allocation under entropy cost produces price stickin...
Fields: Quantum Physics, Microwave Engineering, Electrical Engineering, Information Theory
Caves derived that a linear phase-preserving amplifier with large gain must introduce noise equivalent to at least half a quantum at the input port when referenced against the signal quadrature, trans...
Fields: Physics, Computer Engineering, Thermodynamics, Neuromorphic Computing, Information Theory
Landauer's principle (1961) establishes that logically irreversible operations — those that erase information — must dissipate at least k_BT ln 2 ≈ 3×10⁻²¹ J per bit at room temperature into the envir...
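The k_B T ln 2 figure quoted above, checked numerically; 300 K is assumed for "room temperature":

```python
import math

k_B = 1.380649e-23   # J/K (exact, 2019 SI definition)
T = 300.0            # K, nominal room temperature

landauer_bound = k_B * T * math.log(2)
print(f"Landauer bound at {T:.0f} K: {landauer_bound:.2e} J per erased bit")
# ~2.9e-21 J, i.e. about 3 zJ, matching the figure in the entry.
```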
Fields: Thermodynamics, Information Theory, Cosmology, Statistical Mechanics
Three apparently separate arrows of time — thermodynamic (entropy increases), computational (Landauer: erasing one bit dissipates at least k_B T ln 2 of heat), and cosmological (the universe began in ...
Fields: Thermodynamics, Information Theory, Statistical Physics, Computer Science
Landauer (1961) proved that erasing one bit of information in a thermal environment at temperature T requires dissipating at least k_B * T * ln(2) of free energy as heat — approximately 3 zJ at room t...
Fields: Mathematics, Physics, Information Theory, Dynamical Systems
The Renyi entropy of order q, H_q = (1/(1-q)) log sum_i p_i^q, generates the full multifractal spectrum f(alpha) via Legendre transform tau(q) -> f(alpha); turbulent velocity fields, strange attractor...
Fields: Physics, Mathematics, Information Theory, Quantum Gravity, Thermodynamics
Bekenstein (1973) proposed that a black hole of horizon area A carries entropy S_BH = k_B A/(4 l_P²) (equivalently S_BH = A/4G in natural units with ħ = c = k_B = 1). This is the maximum entropy that can be enclosed in a re...
Fields: Physics, Mathematics, Information Theory, Thermodynamics, Statistical Mechanics
The Boltzmann entropy S = k_B ln W and Shannon entropy H = −Σpᵢ log pᵢ are mathematically identical after substituting k_B and adjusting the logarithm base. Boltzmann counts microstates W consistent w...
Fields: Neuroscience, Condensed Matter Physics, Statistical Mechanics, Information Theory
Neural avalanches (cascades of activity that follow a power-law size distribution) are the biological signature of a system operating near a second-order phase transition — the same mathematical struc...
Fields: Physics, Social Science, Network Science, Epidemiology, Information Theory
SIR rumour model (Daley & Kendall 1965): Individuals are Susceptible (haven't heard), Infected (spreading), Recovered (heard but no longer spreading). Rate equations: dS/dt = -βSI, dI/dt = βSI - γ...
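A forward-Euler integration of the rate equations quoted above; β, γ, the initial conditions, and the step size are illustrative, and the truncated third equation is taken to be the standard closure dR/dt = γI so that S + I + R is conserved:

```python
def simulate_sir(beta=0.3, gamma=0.1, S0=0.99, I0=0.01, R0=0.0, dt=0.1, t_max=200.0):
    S, I, R = S0, I0, R0
    trajectory = []
    for step in range(int(t_max / dt)):
        dS = -beta * S * I
        dI = beta * S * I - gamma * I
        dR = gamma * I
        S, I, R = S + dS * dt, I + dI * dt, R + dR * dt
        trajectory.append((step * dt, S, I, R))
    return trajectory

traj = simulate_sir()
t_peak, _, I_peak, _ = max(traj, key=lambda row: row[2])
print(f"spreading peaks at t ~ {t_peak:.0f} with {I_peak:.1%} of the population spreading")
print(f"final fraction who ever heard the rumour: {traj[-1][2] + traj[-1][3]:.1%}")
```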
Fields: Quantum Mechanics, Molecular Biology, Sensory Neuroscience, Quantum Information Theory
The magnetic compass of migratory songbirds is not a classical ferromagnetic sensor (like a compass needle) but a quantum device: photo-excited electron transfers in the flavin-adenine dinucleotide (F...
Fields: Quantum Computing, Cryptography, Information Theory
BB84 quantum key distribution achieves information-theoretic security (proven secure against computationally unbounded adversaries) because any eavesdropping measurement on quantum states introduces d...
Fields: Quantum Computing, Quantum Information Theory, Computer Science
For a concatenated code of level k with physical error rate p and threshold p_th, the logical error rate scales as p_L = p_th·(p/p_th)^{2^k}. Each level of concatenation doubles the exponent, so after...
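The double-exponential suppression formula quoted above, evaluated for a few concatenation levels; the physical error rate and threshold are illustrative numbers:

```python
def logical_error_rate(p, p_th, k):
    """p_L = p_th * (p / p_th) ** (2 ** k) for a level-k concatenated code."""
    return p_th * (p / p_th) ** (2 ** k)

p, p_th = 1e-3, 1e-2   # physical error rate one order of magnitude below threshold
for k in range(1, 5):
    print(f"level {k}: p_L ~ {logical_error_rate(p, p_th, k):.2e}")
```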
Fields: Quantum Physics, Information Theory
Environment-induced superselection (einselection) identifies pointer states as eigenstates of the system observable that commutes with the system-environment interaction Hamiltonian H_int, explaining ...
Fields: Quantum Information Theory, Quantum Gravity, String Theory, Quantum Error Correction, Condensed Matter Physics
Quantum error correction encodes k logical qubits in n physical qubits with distance d (denoted [[n,k,d]]), such that any error affecting fewer than d/2 qubits can be detected and corrected. The key p...
Fields: Physics, Information Theory, Quantum Physics
The holographic entanglement entropy formula S_A = Area(gamma_A) / (4*G_N*hbar) (Ryu-Takayanagi) states that entanglement entropy of boundary region A in a CFT equals the area of the minimal bulk surf...
Fields: Social Science, Information Theory, Cultural Evolution, Sociology, Communication Theory
Shannon (1948) proved that any communication channel with noise can reliably transmit information at rates up to its channel capacity C = max_{p(x)} I(X;Y), and that error rates rise exponentially abo...
Fields: Social Science, Information Theory, Statistics, Computer Science, Privacy Law
Differential privacy (Dwork et al. 2006): a mechanism M satisfies epsilon-DP if for any adjacent datasets D, D' differing by one record: P[M(D)∈S] ≤ exp(epsilon) × P[M(D')∈S]. This is a formal guarant...
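A minimal Laplace-mechanism sketch for the ε-DP definition above, applied to a counting query whose sensitivity is 1; the ε values and the underlying count are illustrative:

```python
import math, random

def laplace_mechanism(true_value, sensitivity, epsilon, rng):
    """Add Laplace(sensitivity/epsilon) noise; this mechanism satisfies epsilon-DP."""
    scale = sensitivity / epsilon
    u = rng.random() - 0.5                       # inverse-CDF sampling of Laplace noise
    noise = -scale * (1 if u >= 0 else -1) * math.log(1 - 2 * abs(u))
    return true_value + noise

rng = random.Random(0)
true_count = 412                                 # e.g. number of records matching a query
for eps in (0.1, 1.0, 10.0):
    noisy = [laplace_mechanism(true_count, 1.0, eps, rng) for _ in range(5)]
    print(f"epsilon = {eps:>4}: noisy answers {[round(v, 1) for v in noisy]}")
```

Smaller ε means stronger privacy and noisier answers, which is the accuracy-privacy trade-off the definition formalizes.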
Fields: Social Science, Network Science, Sociology, Mathematics, Information Theory
Homophily — the tendency of similar individuals to form ties ("birds of a feather flock together") — is the dominant structural force shaping social networks. Measured by the assortativity coefficient...
Fields: Statistical Mechanics, Information Theory, Thermodynamics
Boltzmann's entropy S = k_B ln W (W = number of equally probable microstates) and Shannon's entropy H = −Σ p_i log p_i (probability distribution over messages) are the same mathematical object up to t...
Fields: Statistical Physics, Information Theory, Thermodynamics
The Crooks fluctuation theorem P_F(W)/P_R(-W) = exp((W - ΔF)/k_B T) and the Jarzynski equality
Fields: Statistics, Mathematical Statistics, Evolutionary Biology, Population Genetics, Quantum Information Theory
R.A. Fisher invented both: (a) the Fisher information matrix I(theta) in statistics (1925) — the expected curvature of the log-likelihood, whose inverse gives the Cramér-Rao lower bound on estimation ...
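A small numerical check of the statistical side of the entry: for a Bernoulli(θ) sample, the Fisher information is I(θ) = 1/(θ(1-θ)) and the Cramér-Rao bound 1/(n I(θ)) is attained by the sample mean; the sample size, θ, and number of trials are arbitrary:

```python
import random, statistics

def fisher_information_bernoulli(theta):
    return 1.0 / (theta * (1.0 - theta))

theta, n, trials = 0.3, 200, 5000
rng = random.Random(0)
estimates = [
    sum(rng.random() < theta for _ in range(n)) / n   # sample mean of n Bernoulli draws
    for _ in range(trials)
]
empirical_var = statistics.pvariance(estimates)
crlb = 1.0 / (n * fisher_information_bernoulli(theta))
print(f"Cramer-Rao lower bound: {crlb:.6f}")
print(f"empirical variance    : {empirical_var:.6f} (the sample mean attains the bound)")
```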
Fields: Thermodynamics, Computer Science, Information Theory, Statistical Mechanics
Maxwell's demon (1867): a hypothetical being that monitors individual molecules in a partitioned gas container, opening a small door to let fast molecules pass to one side and slow ones to the other. ...
Fields: Virology, Information Theory, Evolutionary Biology
Eigen's quasispecies theory maps RNA virus evolution onto an information-theoretic error-correction problem: the master sequence is the optimal codeword, replication fidelity is the channel capacity, ...