Algorithms, computation, and AI
Fields: Machine Learning, Statistical Physics, Information Theory, Neuroscience
Grokking — the phenomenon where a neural network suddenly transitions from memorisation to generalisation after a long plateau — exhibits sharp, non-analytic changes in the effective dimensionality of...
Fields: Machine Learning, Statistical Physics, Condensed Matter Physics
The renormalization group (RG) in statistical physics is a systematic procedure for integrating out short-scale degrees of freedom while preserving long-wavelength behavior, flowing toward fixed point...
Fields: Physics, Biology, Neuroscience, Computer Science, Social Science, Philosophy Of Science, Complex Systems, Mathematics
Anderson's "More is Different" (1972): each level of organisation obeys its own laws not derivable from — though consistent with — lower levels. Formal definition of emergence (Bedau 1997): a system S...
Fields: Aesthetics, Cognitive Science, Information Theory, Mathematics, Music Cognition, Visual Neuroscience
Birkhoff (1933) defined aesthetic measure as M = O/C — order divided by complexity. High order with low complexity (a single constant tone, a uniform colour field) has M → ∞ but is perceived as boring...
Fields: Cosmology, Computational Astrophysics, Computer Science, Algorithms
Simulation post-processing tracks bound substructures across snapshots, assigning parent–child merge events with heuristic linking rules and uncertainty when disruptive tidal stripping fragments ident...
Fields: Astronomy, Quantum Gravity, Information Theory, Quantum Error Correction
Hawking's 1974 calculation showed that black holes radiate thermally, apparently destroying the quantum information contained in infalling matter. This is the information paradox: unitary quantum mech...
Fields: Astronomy, Machine Learning, Space Physics
Speculative analogy (to be empirically validated): Neural-operator surrogates for coupled plasma dynamics can be integrated into sequential data-assimilation loops similarly to reduced-order forecast ...
Fields: Astrophysics, Information Theory, Quantum Gravity, Theoretical Physics
The discovery that black holes have entropy proportional to their surface area — not volume — is the most profound known connection between spacetime geometry and information theory. 1. Bekenstein-Haw...
Fields: Biology, Computer Science
Foraging ants deposit pheromone tau_ij on edges (i,j) of a complete graph proportional to path quality (1/L_k), and choose edges probabilistically as p_{ij} = tau_ij^alpha * eta_ij^beta / sum(tau_il^a...
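The visible choice rule can be sketched directly. A minimal illustration (the pheromone and heuristic values are toy numbers, and `aco_edge_probabilities` is a hypothetical helper name; eta is taken, as in standard ACO, to be inverse edge length):

```python
def aco_edge_probabilities(tau, eta, alpha=1.0, beta=2.0):
    """Edge-choice distribution p_ij ∝ tau_ij^alpha * eta_ij^beta (standard ACO rule)."""
    weights = [(t ** alpha) * (e ** beta) for t, e in zip(tau, eta)]
    total = sum(weights)
    return [w / total for w in weights]

# Three candidate edges out of the current city (toy numbers):
tau = [1.0, 2.0, 0.5]                 # pheromone on each edge
eta = [1 / 3.0, 1 / 5.0, 1 / 2.0]     # heuristic desirability = 1 / edge length
probs = aco_edge_probabilities(tau, eta)
```

With these values the short third edge receives the highest probability despite its lower pheromone, because beta > alpha weights the heuristic term more heavily.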
Fields: Biology, Computer Science, Complex Systems, Evolutionary Biology
Ant colonies solve the traveling salesman problem without central control: foragers deposit pheromone on paths, and shorter paths accumulate pheromone faster (more round trips per unit time), positive...
Fields: Molecular Biology, Genomics, Computer Science, Bioinformatics
CRISPR-Cas9 is a programmable biological search-and-replace algorithm operating on the genome as a character string. The guide RNA (gRNA, ~20 nucleotides) is the search pattern; Cas9 protein is the en...
Fields: Biology, Nanotechnology, Computer Science
The bridge is a labeled metaphor for design practice, not a mechanistic equivalence. Scaffold path constraints, staple crossovers, and annealing schedules can be described like dependency graphs and s...
Fields: Biology, Computer Science, Physics
Reynolds (1987) showed that realistic flocking arises from three steering behaviours: avoid crowding (separation), steer toward average heading (alignment), steer toward average position (cohesion). T...
Fields: Biology, Computer Science, Systems Biology, Developmental Biology
Boolean network models (Kauffman 1969): genes are binary nodes (on/off), each receiving K regulatory inputs and computing a Boolean function of those inputs. The entire N-gene network is a finite dete...
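The finite deterministic dynamics described above can be made concrete with a tiny synchronous network. This sketch uses a hand-picked (not random) 3-gene wiring with K = 2 inputs per gene; all names and rules are illustrative:

```python
def step(state, inputs, funcs):
    """Synchronous update: gene i reads its K regulators and applies its Boolean rule."""
    return tuple(funcs[i][tuple(state[j] for j in inputs[i])] for i in range(len(state)))

def find_attractor(state, inputs, funcs):
    """Iterate the deterministic map until a state repeats; return the cycle."""
    seen, trajectory = {}, []
    while state not in seen:
        seen[state] = len(trajectory)
        trajectory.append(state)
        state = step(state, inputs, funcs)
    return trajectory[seen[state]:]   # the attractor cycle

# Toy 3-gene network: gene 0 computes AND of genes 1,2; genes 1,2 compute ORs.
inputs = [(1, 2), (0, 2), (0, 1)]
AND = {(0, 0): 0, (0, 1): 0, (1, 0): 0, (1, 1): 1}
OR  = {(0, 0): 0, (0, 1): 1, (1, 0): 1, (1, 1): 1}
funcs = [AND, OR, OR]
cycle = find_attractor((1, 1, 0), inputs, funcs)
```

Because the state space is finite (2^N states) and the update is deterministic, every trajectory must eventually enter a cycle; here it settles into the all-on fixed point.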
Fields: Theoretical Biology, Computer Science, Systems Biology
In RBNs each gene updates as a Boolean function of K regulators; for random ensembles the average influence determines whether dynamics freeze into attractors (ordered), wander ergodically (chaotic), ...
Fields: Systems Biology, Computer Science, Mathematics
Stuart Kauffman's Boolean network model assigns each gene a Boolean function of its regulators; finding the attractors (stable gene expression states) of a Boolean regulatory network with N genes and ...
Fields: Cell Biology, Computer Science
A signal transduction network can be abstracted as a Boolean network: each protein is a node (active=1, inactive=0) whose state is updated by a logical rule derived from biochemical interactions. Fixe...
Fields: Biology, Computer Science, Molecular Biology
Speculative analogy: Attention-based sequence modeling can encode long-range residue dependencies relevant to protein fitness landscapes....
Fields: Biology, Computer Science, Optimization, Biophysics
E. coli chemotaxis (biased random walk toward chemical attractants via run-and-tumble motion) implements stochastic gradient ascent on the chemoattractant concentration field; the methylation-based me...
Fields: Biology, Computer Science, Engineering
Biological muscle-tendon units (series elastic actuators) store and release elastic energy during locomotion, reducing metabolic cost below that predicted by rigid-body models; soft robotic actuators ...
Fields: Biology, Computer Science
The transcription-translation feedback loop (TTFL) of circadian clocks (CLOCK-BMAL1/PER-CRY) is a biological relaxation oscillator whose period is set by protein degradation time constants; it is math...
Fields: Biology, Computer Science, Information Theory
Adenine base editors (ABEs) convert A-T to G-C base pairs without double-strand breaks, implementing a precise one-bit correction in the genomic information channel; the specificity window (protospace...
Fields: Biology, Computer Science, Molecular Biology
CRISPR-Cas9 genome editing performs exact string matching (PAM-adjacent target search) and substitution (cut-and-repair) on a 3-billion-character string (the human genome); guide RNA specificity follo...
Fields: Biology, Computer Science, Information Theory
Gene regulatory networks face a fundamental channel capacity limit: the maximum mutual information between transcription factor concentration (input) and target gene expression (output) is bounded by ...
Fields: Biology, Computer Science
Transcription factor combinatorics implement Boolean logic: cooperative binding is AND, competitive binding is NOT, and OR gates arise from redundant enhancers; Kauffman's NK random Boolean network mo...
Fields: Biology, Computer Science, Immunology, Machine Learning
The adaptive immune system's negative selection process (deleting T-cells that recognize self-antigens in the thymus) is computationally equivalent to one-class classification and anomaly detection; t...
Fields: Biology, Computer Science, Information Theory, Evolutionary Biology
Natural selection updates the population's genetic prior toward higher fitness using the same mathematical operation as Bayesian belief updating; Fisher's fundamental theorem of natural selection is t...
Fields: Neuroscience, Computer Science, Biology
Spike-timing dependent plasticity (STDP) implements a temporal Hebbian learning rule: synapses strengthen when pre-synaptic spikes precede post-synaptic spikes (causal), and weaken for reverse order; ...
Fields: Neuroscience, Computer Science, Information Theory
Retinal ganglion cell spike trains are efficient codes in the information-theoretic sense; center-surround receptive fields implement a whitening filter that removes spatial redundancy in natural imag...
Fields: Biology, Computer Science, Complex Systems, Distributed Systems
Ant colony optimization (ACO) and honeybee swarm decision-making implement distributed consensus algorithms without central coordination; pheromone reinforcement in ACO is distributed gradient ascent ...
Fields: Molecular Biology, Information Theory, Coding Theory, Evolutionary Biology, Genetics
Shannon's channel coding theorem (1948) establishes that for any noisy channel with capacity C = B log₂(1 + SNR), there exist codes that transmit information with arbitrarily small error probability a...
Fields: Molecular Biology, Information Theory, Computational Biology
The genetic code has 64 codons encoding 20 amino acids plus stop signals, giving ~1.5 bits of coding redundancy per codon. Synonymous codons (different codons for the same amino acid) are used non-uni...
Fields: Biology, Information Theory, Collective Behavior
Quorum sensing in bacteria: the threshold concentration S_q where gene expression switches satisfies ∂F/∂S = 0 (Hill function bistability), giving a sharp collective switch at population density N > N...
Fields: Biology, Information Theory, Genomics
High-throughput pooled CRISPR experiments assign binary-like signatures to perturbations so downstream sequencing demultiplexes signals — coding theory supplies intuition about Hamming distance and re...
Fields: Biology, Information Theory, Computer Science
Kauffman (1969) modeled gene regulatory networks as Boolean networks: N genes each updated by a Boolean function of K randomly chosen inputs. For K < 2, networks freeze in ordered attractors; for K > ...
Fields: Molecular Biology, Information Theory
Schneider & Stephens (1990) showed that transcription factor binding sites can be quantified as information in bits: the information content Ri = 2 − H(position), where H is Shannon entropy over the f...
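The per-position calculation R_i = 2 − H is a one-liner; the frequency vectors below are illustrative, not from any real binding-site alignment:

```python
from math import log2

def position_information(freqs):
    """Information content in bits at one binding-site position: R_i = 2 - H,
    where H is the Shannon entropy over the four base frequencies."""
    H = -sum(f * log2(f) for f in freqs if f > 0)
    return 2.0 - H

# A perfectly conserved position carries the full 2 bits; a uniform one carries 0.
conserved = position_information([1.0, 0.0, 0.0, 0.0])
uniform = position_information([0.25, 0.25, 0.25, 0.25])
```

Summing R_i across positions gives the total information content of the motif, the quantity drawn as total letter height in a sequence logo.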
Fields: Biology, Machine Learning, Systems Biology
Speculative analogy (to be empirically validated): Message passing over learned gene graphs can act as a computational analogue to mechanistic regulatory propagation assumptions used in perturbation-r...
Fields: Biology, Computer Science, Synthetic Biology
Synthetic gene circuits implement Boolean logic (toggle switches, oscillators, band-pass filters) using the same design principles as electronic circuits; the repressilator (three-gene ring oscillator...
Fields: Biology, Mathematics, Evolutionary Biology, Game Theory, Population Genetics, Machine Learning
The replicator equation, derived independently in evolutionary biology, game theory, and learning theory, is: ẋᵢ = xᵢ (fᵢ(x) - f̄(x)) where xᵢ is the frequency of strategy i, fᵢ(x) = Σⱼ aᵢⱼ xⱼ is ...
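The equation above integrates directly with an Euler step. A minimal sketch (the 2x2 payoff matrix is a hypothetical Hawk-Dove-style example chosen so the interior mixed equilibrium attracts the dynamics):

```python
def replicator_step(x, A, dt=0.01):
    """One Euler step of dx_i/dt = x_i (f_i(x) - fbar(x)), with payoff f_i = (A x)_i."""
    n = len(x)
    f = [sum(A[i][j] * x[j] for j in range(n)) for i in range(n)]
    fbar = sum(x[i] * f[i] for i in range(n))
    return [x[i] + dt * x[i] * (f[i] - fbar) for i in range(n)]

A = [[0.0, 2.0],
     [1.0, 1.0]]          # hypothetical payoff matrix for illustration
x = [0.1, 0.9]            # initial strategy frequencies
for _ in range(5000):
    x = replicator_step(x, A)
```

Note that the update preserves the simplex (frequencies stay nonnegative and sum to one up to float error), and this payoff matrix drives the population to the mixed equilibrium x = (1/2, 1/2), where both strategies earn equal payoff.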
Fields: Biology, Physics, Biochemistry, Statistical Mechanics, Computer Science
Protein folding is a search on a high-dimensional energy landscape E(conformation). The "funnel" landscape hypothesis (Bryngelson & Wolynes 1987): native proteins have evolved funneled energy landscap...
Fields: Biophysics, Information Theory, Systems Biology, Nonlinear Dynamics
In excitable and threshold-like cellular pathways, moderate noise can increase detectability of weak periodic inputs by synchronizing barrier crossings with subthreshold stimuli. This maps directly to...
Fields: Biostatistics, Machine Learning, Medicine
Speculative analogy (to be empirically validated): Monte Carlo dropout predictive uncertainty can inform adaptive stopping boundaries similarly to posterior predictive criteria in Bayesian trial monit...
Fields: Chemistry, Molecular Biology, Computer Science
Speculative analogy: Predicted structure-confidence patterns can serve as priors for pruning enzyme design search spaces before expensive wet-lab screening....
Fields: Chemistry, Computer Science
Speculative analogy: Docking search strategies can use funnel-ruggedness diagnostics from energy-landscape theory to avoid overcommitting to shallow local minima during pose exploration....
Fields: Chemistry, Computer Science, Mathematics
Chemical reaction networks (CRNs) are exactly Petri nets: species are places, reactions are transitions, stoichiometric coefficients are arc weights, and concentration dynamics are token flows; Petri ...
Fields: Chemistry, Machine Learning, Materials Science
Speculative analogy (to be empirically validated): VAE latent manifolds can compress catalyst structural descriptors into smooth generative coordinates that support guided exploration of activity-sele...
Fields: Chemistry, Computer Science, Mathematics
Soloveichik et al. (2008) proved that stochastic CRNs are Turing-complete: given arbitrary initial molecule counts, a finite CRN can simulate any register machine and hence compute any computable func...
Fields: Climate Science, Machine Learning, Statistics
Speculative analogy (to be empirically validated): Reverse-diffusion sampling can act as a controllable stochastic refinement operator analogous to ensemble post-processing used to downscale and debia...
Fields: Neuroscience, Cognitive Science, Information Theory, Sensory Physiology, Computational Neuroscience
Barlow (1961) proposed that the goal of sensory processing is to represent the environment using the minimum number of active neurons — equivalently, to maximize the Shannon mutual information I(stimu...
Fields: Cognitive Science, Linguistics, Computer Science
The cosine similarity between word vectors trained on large corpora predicts human semantic similarity ratings (Pearson r ~ 0.8) and word association norms, because both reflect the co-occurrence stat...
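The similarity measure itself is elementary; the vectors below are toy stand-ins, not trained embeddings:

```python
from math import sqrt

def cosine_similarity(u, v):
    """cos(theta) = u.v / (||u|| ||v||): 1 for parallel vectors, 0 for orthogonal."""
    dot = sum(a * b for a, b in zip(u, v))
    return dot / (sqrt(sum(a * a for a in u)) * sqrt(sum(b * b for b in v)))
```

In practice u and v would be, e.g., 300-dimensional embedding rows, and the scalar returned here is what gets correlated against human similarity ratings.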
Fields: Cognitive Science, Physics, Neuroscience, Machine Learning, Thermodynamics, Theoretical Biology
Friston (2010) proposed that all biological self-organisation can be understood as the minimisation of variational free energy F, where: F = E_q[log q(s)] − E_q[log p(s,o)] = KL[q(s) || p(s|o)]...
Fields: Computer Science, Biology, Mathematics, Evolutionary Theory
Holland's genetic algorithm (1975) implements natural selection on populations of candidate solutions: selection (fitness proportionate reproduction), crossover (genetic recombination), and mutation (...
Fields: Computer Science, Economics, Game Theory, Network Science, Mechanism Design
CLASSICAL PROBLEM: Internet protocols (BGP routing, TCP congestion control) are designed for cooperative agents, but the actual Internet is composed of self-interested autonomous systems (ASes) that may d...
Fields: Computer Science, Mathematics, Combinatorial Optimization, Convex Optimization, Complexity Theory, Graph Theory
SDP generalizes linear programming: minimize Tr(CX) subject to linear equality constraints A_i·X = b_i and the matrix inequality X ≽ 0 (positive semidefinite). X ≽ 0 replaces the linear constraint x_i ∈ [0,1] (LP relaxation...
Fields: Computer Science, Mathematics, Complex Systems
A cellular automaton is computationally universal if it can simulate any Turing machine: Wolfram's Rule 110 (a 1D elementary CA) is Turing complete (Cook, 2004), and Conway's Game of Life implements l...
Fields: Computer Science, Mathematics, Statistical Physics, Combinatorics, Information Theory
Many NP-complete problems (3-SAT, graph coloring, random k-SAT, traveling salesman) exhibit sharp phase transitions in their typical-case hardness as a control parameter varies. In random k-SAT: let α...
Fields: Computer Science, Mathematics
Rule 110 is a one-dimensional cellular automaton (1D CA) with 2 states and a specific local rule. Cook (2004) proved it is Turing-complete: it can simulate any Turing machine. This means no algorithm ...
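The local rule is small enough to write out in full: 110 in binary is 01101110, one output bit per neighbourhood. A minimal simulator (periodic boundary and the single-cell seed are illustrative choices):

```python
def rule110_step(cells):
    """Apply Rule 110 to one row (periodic boundary).
    110 = 0b01101110: neighbourhood (left, centre, right) -> new centre cell."""
    table = {(1, 1, 1): 0, (1, 1, 0): 1, (1, 0, 1): 1, (1, 0, 0): 0,
             (0, 1, 1): 1, (0, 1, 0): 1, (0, 0, 1): 1, (0, 0, 0): 0}
    n = len(cells)
    return [table[(cells[(i - 1) % n], cells[i], cells[(i + 1) % n])]
            for i in range(n)]

row = [0] * 15 + [1]        # single live cell
history = [row]
for _ in range(8):
    row = rule110_step(row)
    history.append(row)
```

The undecidability claim applies to questions about the long-run behaviour of such simulations: the code can always compute the next row, but no algorithm can decide all reachability questions about where it ends up.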
Fields: Computer Science, Mathematics, Numerical Analysis
Forward inference solves z = f(z) via root-finding or fixed-point iteration; reverse-mode derivatives apply the implicit function theorem (I − J)^{-1} structure analogous to adjoint sensitivity analys...
Fields: Computer Science, Mathematics
Dung's abstract argumentation framework AF = (AR, attacks) maps legal arguments to nodes and legal rebuttals/undercutters to directed edges, with grounded, preferred, and stable extension semantics pr...
Fields: Computer Science, Mathematics, Statistical Learning Theory
PAC (Probably Approximately Correct) learning: a hypothesis class H is ε-δ PAC-learnable if for all ε,δ > 0 there exists a sample complexity m ≥ (1/ε)[ln|H| + ln(1/δ)] (finite H) such that with probab...
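The finite-class sample-complexity bound quoted above evaluates directly; the hypothesis-class size and accuracy targets below are illustrative:

```python
from math import ceil, log

def pac_sample_complexity(h_size, eps, delta):
    """m >= (1/eps) * (ln|H| + ln(1/delta)) for a finite hypothesis class H:
    enough samples that, with prob. >= 1-delta, the learned h has error <= eps."""
    return ceil((log(h_size) + log(1.0 / delta)) / eps)

# e.g. |H| = 1000 hypotheses, 10% error tolerance, 95% confidence:
m = pac_sample_complexity(1000, eps=0.1, delta=0.05)
```

The bound grows only logarithmically in |H| and 1/δ but linearly in 1/ε, which is why tightening the accuracy target is far more expensive than enlarging the hypothesis class.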
Fields: Mathematics, Computer Science, Cryptography
The NFS algorithm for factoring n applies algebraic number theory (number fields with rings of integers, ideal factorization in class groups) to the combinatorial sieve: it finds pairs (a,b) such that...
Fields: Mathematics, Combinatorics, Computer Science, Algorithm Design, Probability Theory
The probabilistic method (Erdős 1947): to prove that a combinatorial object with property P exists, construct a suitable probability space, show the random object lacks property P with probability < 1...
Fields: Computer Science, Mathematics, Statistical Physics, Combinatorics
A random 3-SAT instance with n variables and m = αn clauses (each clause containing 3 random variables in random polarity) undergoes a sharp phase transition at critical ratio α_c ≈ 4.267 (Kirkpatrick...
Fields: Computer Science, Mathematics
The Curry-Howard correspondence (Curry 1934, Howard 1980) reveals a deep structural identity between formal logic and type theory in programming languages: propositions correspond to types, proofs cor...
Fields: Machine Learning, Neuroscience, Computational Neuroscience
Attention weights are a_ij = softmax_j(q_i · k_j / √d): nonnegative, sum-to-one over j for fixed i, resembling a divisive normalization across locations/channels after an expansive nonlinearity (exp)....
Fields: Computer Science, Neuroscience, Cognitive Science, Machine Learning, Computational Neuroscience
The transformer attention mechanism (Vaswani et al. 2017): Attention(Q, K, V) = softmax(QKᵀ / √d_k) V operates on queries Q, keys K, and values V. Each output position attends to all input positio...
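The formula maps directly to a few lines of numpy; the Q, K, V matrices below are toy values (2 queries, 3 keys, 1-dimensional values), not from any trained model:

```python
import numpy as np

def attention(Q, K, V):
    """softmax(Q K^T / sqrt(d_k)) V: each query position takes a convex
    combination of the value vectors, weighted by query-key similarity."""
    d_k = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)
    w = np.exp(scores - scores.max(axis=-1, keepdims=True))  # stable softmax
    w = w / w.sum(axis=-1, keepdims=True)
    return w @ V, w

Q = np.array([[1.0, 0.0], [0.0, 1.0]])
K = np.array([[1.0, 0.0], [0.0, 1.0], [1.0, 1.0]])
V = np.array([[1.0], [2.0], [3.0]])
out, weights = attention(Q, K, V)
```

Each row of `weights` is a probability distribution over input positions, which is what makes the output a convex mixture of values rather than an arbitrary linear map.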
Fields: Computer Science, Statistical Physics
Random k-SAT and related NP-hard combinatorial optimization problems undergo a sharp phase transition at a critical clause-to-variable ratio α_c where the fraction of satisfiable instances drops from ...
Fields: Computer Science, Physics, Dynamical Systems
Established data-driven method (EDMD) approximates Koopman eigenfunctions from trajectory dictionaries; speculative analogy for video—learned linear evolution in lifted feature spaces may forecast sho...
Fields: Computer Science, Physics, Quantum Information, Computational Complexity
Classical computational complexity: the class BPP (bounded-error probabilistic polynomial time) captures what classical computers can efficiently compute. BQP (bounded-error quantum polynomial time) a...
Fields: Computer Science, Physics, Quantum Computing, Computational Complexity, Quantum Information
Google's 53-qubit Sycamore processor (Arute et al. 2019) sampled the output distribution of a pseudo-random quantum circuit in 200s, with classical simulation estimated at 10,000 years on Summit super...
Fields: Machine Learning, Statistical Physics, Computer Science, Information Theory
Energy-based models assign low energy to plausible configurations; training shapes the energy landscape so that data lie in wells. Contrastive objectives such as InfoNCE reweight logits of positive ve...
Fields: Computer Science, Combinatorial Optimization, Statistical Mechanics, Thermodynamics
Kirkpatrick et al. (1983) introduced simulated annealing by recognising that combinatorial optimization problems are formally equivalent to finding the ground state of a physical system. The acceptanc...
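The Metropolis acceptance rule and geometric cooling schedule can be sketched on a toy landscape; the objective, neighbourhood move, and schedule parameters here are all illustrative choices, not Kirkpatrick et al.'s settings:

```python
import math
import random

def simulated_annealing(f, x0, neighbour, T0=10.0, cooling=0.995, steps=5000, seed=0):
    """Minimise f: accept downhill moves always, uphill moves with prob exp(-dE/T),
    while T decays geometrically from T0."""
    rng = random.Random(seed)
    x, fx = x0, f(x0)
    best, fbest = x, fx
    T = T0
    for _ in range(steps):
        y = neighbour(x, rng)
        fy = f(y)
        if fy < fx or rng.random() < math.exp(-(fy - fx) / T):
            x, fx = y, fy
            if fx < fbest:
                best, fbest = x, fx
        T *= cooling
    return best, fbest

# Toy landscape: quadratic over the integers with its minimum at x = 3.
best, fbest = simulated_annealing(lambda x: (x - 3) ** 2,
                                  x0=-20,
                                  neighbour=lambda x, rng: x + rng.choice([-1, 1]))
```

Early on, high T makes the walk nearly free (exploration); as T falls, uphill acceptance probability collapses and the dynamics become greedy descent into the current basin.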
Fields: Computer Science, Theoretical Machine Learning, Statistics, Statistical Physics, Information Theory
PAC (Probably Approximately Correct) learning theory (Valiant 1984) provides a mathematical framework for when a learning algorithm can generalise from training data to unseen examples. A concept clas...
Fields: Computer Science, Statistics, Machine Learning, Computational Physics
Parallel tempering mitigates trapping in rugged posterior landscapes by swapping chains across temperature levels. The method is established in molecular simulation and increasingly relevant for Bayes...
Fields: Statistics, Computer Science, Machine Learning, Applied Mathematics
Ordinary least squares minimizes squared error; adding an L2 penalty pulls coefficients toward zero, stabilizing ill-conditioned designs by trading bias for variance. Equivalently, with Gaussian likel...
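The L2-penalised estimator has the familiar closed form; the synthetic design, true coefficients, and noise level below are illustrative:

```python
import numpy as np

def ridge(X, y, lam):
    """Closed-form ridge estimate: beta = (X^T X + lam*I)^{-1} X^T y.
    lam = 0 recovers OLS; lam > 0 shrinks coefficients toward zero."""
    d = X.shape[1]
    return np.linalg.solve(X.T @ X + lam * np.eye(d), X.T @ y)

rng = np.random.default_rng(0)
X = rng.normal(size=(50, 3))
beta_true = np.array([2.0, -1.0, 0.5])
y = X @ beta_true + 0.01 * rng.normal(size=50)
b_ols = ridge(X, y, lam=0.0)
b_shrunk = ridge(X, y, lam=10.0)
```

Adding lam*I bounds the smallest eigenvalue of the system away from zero, which is exactly the stabilisation of ill-conditioned designs the entry describes; the cost is the visible shrinkage of `b_shrunk` relative to `b_ols`.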
Fields: Computer Vision, Radiology, Signal Processing
Speculative analogy: Restricted-measurement sparse recovery theory can guide MRI acquisition schedules that preserve clinically relevant structure at lower scan times....
Fields: Computer Science, Medicine, Ophthalmology
Speculative analogy: Residual skip pathways mitigate optimization degradation in medical image classifiers and can improve robustness in retinal screening workflows....
Fields: Critical Care, Machine Learning, Stochastic Processes
Speculative analogy (to be empirically validated): neural CDEs translate irregularly sampled physiologic streams into continuous control paths, mirroring how rough-path summaries preserve temporal sig...
Fields: Biology, Computer Science, Information Theory, Molecular Biology
DNA replication achieves an error rate of approximately 10^-9 per base through a three-stage error-correction pipeline (polymerase insertion selectivity 10^-5, 3'→5' exonuclease proofreading 10^-2, p...
Fields: Computer Science, Biology, Evolutionary Biology
Genetic algorithms (mutation, crossover, selection on fitness) are a direct mathematical abstraction of natural selection; Holland's schema theorem proves that GAs implicitly sample an exponential num...
Fields: Computer Science, Biology, Evolutionary Biology
Neural architecture search (NAS) algorithms - NEAT, evolutionary NAS, AmoebaNet - mimic biological evolution: networks are organisms, architectures are genotypes, validation accuracy is fitness, and m...
Fields: Computer Science, Neuroscience, Mathematics
Visual cortex V1 simple cells learn sparse overcomplete representations of natural images (Olshausen & Field 1996) that are equivalent to dictionary learning in compressed sensing; the cortex solves a...
Fields: Economics, Computer Science, Mathematics, Cryptography
Cryptographic protocol security (no computationally bounded adversary can profitably deviate) is a Nash equilibrium condition in a game where parties are rational agents maximizing expected utility; r...
Fields: Economics, Computer Science, Mathematics
Mechanism design (designing rules so truthful reporting is the dominant strategy) and competitive market equilibrium (where no agent can profitably deviate) are dual formulations of the same incentive...
Fields: Computer Science, Physics, Mathematics
The satisfiability phase transition (SAT/UNSAT boundary near clause-to-variable ratio alpha approximately 4.27 for 3-SAT) coincides with a spin-glass phase transition in the random K-SAT energy landsc...
Fields: Computer Science, Mathematics, Signal Processing
Compressed sensing proves that a sparse signal in R^n can be exactly recovered from O(k log n) random linear measurements (far fewer than n) by L1 minimization; this connects the restricted isometry p...
Fields: Computer Science, Mathematics, Machine Learning
Graph convolutional networks perform convolution in the spectral domain of the graph Laplacian; filters are polynomials of eigenvalues (spectral filters), and message passing is equivalent to diffusio...
Fields: Computer Science, Mathematics, Linear Algebra, Probability
Google's PageRank algorithm computes the stationary distribution of a random walk on the web graph with teleportation probability alpha; this is exactly the left eigenvector of the Google matrix G = a...
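The stationary distribution can be found by plain power iteration on the Google matrix, here written link-by-link rather than as an explicit matrix (with the usual convention that the walker follows a link with probability alpha and teleports otherwise; the 3-page web is a toy example):

```python
def pagerank(adj, alpha=0.85, iters=100):
    """Power iteration for the stationary distribution of the teleporting walk.
    adj[i] lists the pages that page i links to."""
    n = len(adj)
    rank = [1.0 / n] * n
    for _ in range(iters):
        new = [(1 - alpha) / n] * n          # teleportation mass
        for i, links in enumerate(adj):
            if links:
                share = alpha * rank[i] / len(links)
                for j in links:
                    new[j] += share
            else:                             # dangling node: jump uniformly
                for j in range(n):
                    new[j] += alpha * rank[i] / n
        rank = new
    return rank

# Toy web: 0 -> 1, 1 -> 2, 2 -> 0 and 2 -> 1.
ranks = pagerank([[1], [2], [0, 1]])
```

Page 1 ends up top-ranked because it receives links from both other pages; the iteration converges geometrically at rate alpha, so 100 sweeps are ample here.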
Fields: Computer Science, Mathematics, Control Theory, Optimization
Reinforcement learning (Q-learning, policy gradients, TD-learning) solves the Bellman optimality equation V*(s) = max_a [R(s,a) + gamma E[V*(s')]] via function approximation; this connects RL to Bellm...
Fields: Computer Science, Mathematics
Arc consistency algorithms (AC-3) in constraint satisfaction problems perform the same logical deduction as unit propagation in DPLL SAT solvers; both compute the fixpoint of a constraint propagation ...
Fields: Computer Science, Mathematics, Network Science
Social network centrality measures (PageRank, Katz centrality, eigenvector centrality, HITS) are all variants of the dominant eigenvector of the adjacency or transition matrix; the attenuation factor ...
Fields: Computer Science, Mathematics
Spectral clustering finds community structure by computing eigenvectors of the graph Laplacian L = D - A; the Fiedler vector (second smallest eigenvector) bisects the graph at minimum cut, and k eigen...
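The Fiedler-vector bisection can be shown on a graph whose community structure is unambiguous: two triangles joined by one bridge edge (a toy graph chosen for illustration):

```python
import numpy as np

def fiedler_partition(A):
    """Bisect a graph by the sign pattern of the second-smallest eigenvector
    of the unnormalised Laplacian L = D - A."""
    D = np.diag(A.sum(axis=1))
    L = D - A
    eigvals, eigvecs = np.linalg.eigh(L)   # eigh returns ascending eigenvalues
    fiedler = eigvecs[:, 1]
    return fiedler >= 0

# Two triangles (nodes 0-2 and 3-5) joined by the single bridge edge (2,3):
A = np.zeros((6, 6))
for i, j in [(0, 1), (0, 2), (1, 2), (3, 4), (3, 5), (4, 5), (2, 3)]:
    A[i, j] = A[j, i] = 1.0
side = fiedler_partition(A)
```

The sign split lands exactly on the bridge, recovering the minimum cut; with k clusters one would instead embed nodes using the first k eigenvectors and run k-means in that space.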
Fields: Computer Science, Physics, Complexity Science
Conway's Game of Life and Wolfram's Rule 110 one-dimensional cellular automaton are Turing-complete; the capacity for universal computation emerges from simple local rules without central coordination...
Fields: Computer Science, Mathematics, Dynamical Systems, Machine Learning
Neural ordinary differential equations (Chen et al. 2018) define network depth as continuous time in an ODE system dh/dt = f(h,t,theta); the network learns a vector field whose flow map transforms inp...
Fields: Physics, Computer Science
Matrix product states (MPS) and tensor network contractions provide an efficient classical representation of quantum many-body states with limited entanglement; the DMRG algorithm is a tensor network ...
Fields: Dynamical Systems, Critical Care, Signal Processing
Speculative analogy: Delay-embedding reconstructions can transfer from nonlinear dynamics to ICU deterioration early-warning indicators....
Fields: Computer Science, Critical Care, Physiology
Speculative analogy: LSTM gating provides a sequence-memory abstraction that can capture delayed physiological interactions in ICU time-series forecasting....
Fields: Ecology, Computer Science, Statistical Physics
Increasing noise η in Vicsek models destroys orientational order beyond critical η_c analogous (qualitatively) to consensus latency rising until leader election thrashes — topological versus metric ne...
Fields: Ecology, Biodiversity Science, Information Theory, Statistical Mechanics, Biogeography
Shannon's entropy H = -Σ_i p_i log p_i applied to species i with relative abundance p_i is used directly as a biodiversity index (H' or Shannon diversity), quantifying uncertainty in the species ident...
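The index computes in a few lines from raw abundance counts; the communities below are illustrative:

```python
from math import log

def shannon_diversity(abundances):
    """H' = -sum_i p_i ln p_i over relative abundances p_i (natural-log convention)."""
    total = sum(abundances)
    ps = [a / total for a in abundances if a > 0]
    return -sum(p * log(p) for p in ps)

even_community = shannon_diversity([25, 25, 25, 25])     # maximal: ln(4)
skewed_community = shannon_diversity([97, 1, 1, 1])      # one dominant species
```

Evenness maximises H' for a fixed species count (H' = ln S), and dominance by one species drives it toward zero, which is exactly the uncertainty-in-species-identity reading of the index.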
Fields: Ecology, Machine Learning, Agriculture
Speculative analogy (to be empirically validated): Transformer attention over multi-scale canopy imagery can act as a surrogate for agronomic context integration used to infer emergent crop stress pat...
Fields: Ecology, Statistics, Information Theory, Conservation Biology, Bayesian Inference
Jaynes (1957) formulated the maximum entropy (MaxEnt) principle for statistical inference: among all probability distributions consistent with known constraints (expected values of observable features...
Fields: Mathematics, Computer Science
The Wasserstein distance (earth mover's distance) from optimal transport theory provides a geometrically meaningful metric on probability distributions that captures spatial structure; Wasserstein GAN...
Fields: Economics, Information Theory, Probability Theory, Finance, Stochastic Processes
Fama (1970) defined the Efficient Market Hypothesis (EMH): asset prices fully reflect all available information. Samuelson (1965) showed that this is mathematically equivalent to the statement that pr...
Fields: Economics, Machine Learning, Statistics
Speculative analogy (to be empirically validated): Causal forests can operationalize localized elasticity estimation similarly to structural policy analyses that segment populations by marginal respon...
Fields: Economics, Computer Science, Mathematics
Computing the optimal (revenue-maximizing) mechanism for multi-item auctions with multiple bidders is NP-hard in general (Conitzer & Sandholm 2002); this hardness result explains why real-world auctio...
Fields: Economics, Mathematics, Political Science, Computer Science
Arrow's impossibility theorem (1951) proves: any social welfare function on ≥3 alternatives satisfying unanimity (Pareto efficiency) and independence of irrelevant alternatives (IIA) must be dictatori...
Fields: Economics, Mathematics, Computer Science, Game Theory
The central problem of mechanism design: how to aggregate private information (valuations, preferences) from self-interested agents into collective decisions (allocations, prices) without the agents h...
Fields: Economics, Statistical Physics, Econophysics, Information Theory
Dragulescu & Yakovenko (2000) demonstrated that if economic agents exchange wealth in random pairwise interactions conserving total wealth (analogous to elastic collisions conserving energy), the stat...
Fields: Electrical Engineering, Computer Science
Speculative analogy: PMU streams are graph signals on transmission topology, so graph-wavelet energy can isolate localized disturbances faster than nodewise threshold alarms....
Fields: Electromagnetism, Information Theory, Communications Engineering
Maxwell's equations in free space admit plane wave solutions of the form E = E₀ exp(i(k·r − ωt)), which are identical in mathematical structure to the carrier waves used in all radio, microwave, and o...
Fields: Engineering, Computer Science, Distributed Systems, Mathematics, Fault Tolerance, Blockchain
Fischer-Lynch-Paterson (FLP) impossibility (1985): in an asynchronous system where messages may be delayed arbitrarily and at least one process may fail silently, no deterministic algorithm can guaran...
Fields: Engineering, Machine Learning, Power Systems
Speculative analogy (to be empirically validated): Graph-transformer attention can approximate contingency ranking functions similarly to fast security-assessment heuristics derived from network sensi...
Fields: Engineering, Operations Research, Mathematics, Graph Theory, Combinatorial Optimization, Computer Science
Graph algorithms represent one of the most direct translations of mathematical theory into engineering practice: Shortest path: Dijkstra (1959) — O(E log V) with binary heap for non-negative edge weig...
Fields: Engineering, Mathematics, Information Theory, Computer Science
Shannon's source coding theorem (1948) proves that a source with entropy H bits/symbol can be losslessly compressed to H bits/symbol on average but not below — setting an absolute mathematical lower ...
Fields: Engineering, Mathematics, Optimization, Convex Analysis, Machine Learning
Gradient descent x_{t+1} = x_t - η∇f(x_t) converges at rate O(1/t) for L-smooth convex f (Lipschitz gradient, ‖∇f(x)-∇f(y)‖ ≤ L‖x-y‖) and at rate O(exp(-μt/L)) for μ-strongly convex f (where μ = σ_min...
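Both rates can be seen on a diagonal quadratic, where each coordinate contracts independently; the curvature values L and mu below are illustrative:

```python
def gradient_descent(grad, x0, eta, steps):
    """Plain iteration x_{t+1} = x_t - eta * grad(x_t)."""
    x = x0
    for _ in range(steps):
        x = [xi - eta * gi for xi, gi in zip(x, grad(x))]
    return x

# f(x) = 0.5*(L*x1^2 + mu*x2^2): L-smooth and mu-strongly convex, condition number L/mu.
L, mu = 10.0, 1.0
grad = lambda x: [L * x[0], mu * x[1]]
x = gradient_descent(grad, [1.0, 1.0], eta=1.0 / L, steps=200)
```

With step size 1/L, the stiff coordinate is solved immediately while the flat coordinate contracts by the factor (1 - mu/L) per step, which is the exp(-mu*t/L) rate quoted above: the condition number L/mu governs how slow the slowest direction is.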
Fields: Engineering, Computer Science, Social Science, Economics, Game Theory
Cybersecurity bridges engineering (technical attack/defense mechanisms) and social science (human behavior, economics, game theory). The CIA triad (Confidentiality, Integrity, Availability) provides t...
Fields: Engineering, Social Science, Operations Research, Economics, Computer Science, Mechanism Design
Operations research (OR) develops algorithms for resource allocation under constraints. Market design applies these algorithms to real economic markets — transforming abstract optimization theory into...
Fields: Engineering, Social Science, Computer Science, Urban Planning
Smart city platforms aggregate IoT sensor data (traffic flow, air quality, energy consumption, pedestrian density) for real-time urban management. The data pipeline runs from edge computing (latency <...
Fields: Epidemiology, Machine Learning, Distributed Systems
Speculative analogy (to be empirically validated): FedAvg-style decentralized optimization can combine geographically distributed surveillance models while preserving local governance constraints and ...
Fields: Geophysics, Computer Science, Inverse Problems, Optimization
Both full-waveform seismic inversion and deep learning compute gradients by propagating sensitivities backward through a forward model. The mapping is non-trivial because it lets geophysics borrow opt...
Fields: Geophysics, Geostatistics, Statistics, Machine Learning, Spatial Analysis
Kriging (Krige 1951, formalised by Matheron 1963) is the minimum-variance linear unbiased estimator for spatially correlated data: Ẑ(x₀) = Σᵢ λᵢZ(xᵢ), where the optimal weights λᵢ are determined by so...
Fields: Geoscience, Machine Learning, Remote Sensing
Speculative analogy (to be empirically validated): encoder-decoder skip architectures developed for biomedical segmentation transfer to flood delineation by preserving fine boundary detail while integ...
Fields: Hydrology, Computer Science
Speculative analogy: Fourier neural operators can approximate families of PDE solution maps for groundwater flow, enabling amortized inverse-model exploration with uncertainty-aware screening before f...
Fields: Immunology, Machine Learning, Bioinformatics
Speculative analogy (to be empirically validated): Large-scale protein sequence pretraining may transfer contextual representations to TCR-antigen binding tasks similarly to repertoire-level priors us...
Fields: Immunology, Physics, Information Theory, Statistical Mechanics, Mathematics
The adaptive immune system must recognize ~10¹⁵ possible foreign antigens using only ~10⁷ circulating T-cell clones (each with a distinct T-cell receptor, TCR). This is a covering problem: the T-cell ...
Fields: Infectious Disease, Machine Learning, Structural Biology
Speculative analogy (to be empirically validated): masked-autoencoder pretraining on molecular imagery can learn reconstruction priors that improve low-SNR cryo-EM downstream tasks without requiring e...
Fields: Information Theory, Molecular Evolution, Statistical Physics, Virology
Manfred Eigen's quasispecies theory (1971) shows that a replicating population of sequences (RNA, DNA, or proteins) undergoes a phase transition at a critical mutation rate μ_c: below μ_c, a "master...
Fields: Information Theory, Epistemology, Network Science, Cognitive Science, Library Science, Science Of Science
Shannon's channel capacity theorem (C = B log₂(1 + S/N)) provides a formal framework for the scientific knowledge overload problem. Consider each scientific domain as a transmitter and each researcher...
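As a quick numerical illustration of the capacity formula (the bandwidth and SNR values are the classic telephone-channel numbers, used here purely as an example):

```python
import math

# Shannon-Hartley capacity C = B * log2(1 + S/N) for an AWGN channel.
def channel_capacity(bandwidth_hz, snr_linear):
    return bandwidth_hz * math.log2(1.0 + snr_linear)

# 3 kHz bandwidth at 30 dB SNR (S/N = 1000): roughly 30 kbit/s.
C = channel_capacity(3000.0, 1000.0)
```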
Fields: Information Theory, Genetics, Computer Science
Established engineering practice uses sum-product / approximate message passing algorithms on graphical models for large-scale genotype phasing and related inference tasks; residual speculative analog...
Fields: Information Theory, Molecular Biology, Genetics, Evolutionary Biology
Shannon's (1948) framework maps onto molecular genetics with striking precision. The DNA alphabet has size q = 4 (A, T, G, C), so the maximum entropy per position is log₂(4) = 2 bits. The information ...
Fields: Information Theory, Computational Linguistics, Machine Learning
Shannon–McMillan–Breiman asymptotic equipartition implies typical sequences carry ~nh bits per n symbols for ergodic processes with entropy rate h. Neural language models minimize average negative log...
Fields: Linguistics, Information Theory, Cognitive Science, Statistical Physics, Complexity Science
Zipf (1949) observed that the frequency of a word is inversely proportional to its rank in the frequency table: f(r) ∝ 1/r. This power law appears in word frequencies across all natural languages, cit...
Fields: Linguistics, Mathematics, Computer Science, Cognitive Science, Formal Language Theory
Chomsky (1956, 1959) identified a hierarchy of formal languages classified by the computational power required to generate or recognize them. The four levels and their automaton equivalences: — Type 3...
Fields: Materials Science, Machine Learning, Chemistry
Speculative analogy (to be empirically validated): Bayesian-optimization acquisition policies can function as adaptive design rules analogous to sequential alloy-screening heuristics in autonomous mat...
Fields: Mathematics, Computer Science, Type Theory, Logic
The Curry-Howard-Lambek correspondence establishes a three-way isomorphism between typed lambda calculus, intuitionistic logic, and Cartesian closed categories; monads in Haskell are exactly monads in...
Fields: Mathematics, Computer Science
Expander graphs (high connectivity, large spectral gap in the Laplacian) are the combinatorial objects underlying modern error-correcting codes; LDPC codes and turbo codes have Tanner graphs that are ...
Fields: Mathematics, Computer Science, Signal Processing
The discrete Fourier transform (DFT) and its fast algorithm (FFT) provide an exact dual representation of any finite signal in the frequency domain; the convolution theorem (multiplication in frequenc...
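The convolution theorem can be verified directly with a naive O(n²) DFT; this sketch (all names are illustrative) trades the FFT's O(n log n) speed for readability:

```python
import cmath

def dft(x):
    # Forward DFT, O(n^2) by direct summation.
    n = len(x)
    return [sum(x[k] * cmath.exp(-2j * cmath.pi * j * k / n)
                for k in range(n)) for j in range(n)]

def idft(X):
    # Inverse DFT with the 1/n normalization.
    n = len(X)
    return [sum(X[j] * cmath.exp(2j * cmath.pi * j * k / n)
                for j in range(n)) / n for k in range(n)]

def circular_convolve(x, h):
    # Direct circular convolution in the "time" domain.
    n = len(x)
    return [sum(x[k] * h[(m - k) % n] for k in range(n)) for m in range(n)]

x = [1.0, 2.0, 3.0, 4.0]
h = [0.5, 0.25, 0.0, 0.25]
direct = circular_convolve(x, h)
# Convolution theorem: multiply spectra pointwise, transform back.
via_fourier = idft([a * b for a, b in zip(dft(x), dft(h))])
```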
Fields: Mathematics, Computer Science, Data Science
Persistent homology computes Betti numbers (β₀: connected components, β₁: loops, β₂: voids) across all length scales simultaneously, producing a persistence diagram that is a provably stable shape fin...
Fields: Mathematics, Computer Science
ReLU neural networks compute piecewise-linear functions that are exactly tropical polynomials in tropical (max-plus) algebra; the number of linear regions of a deep ReLU network grows exponentially wi...
Fields: Mathematics, Physics, Dynamical Systems, Information Theory
Deterministic chaos (positive Lyapunov exponents, sensitive dependence on initial conditions) is the physical manifestation of ergodic mixing in measure-preserving dynamical systems; the Kolmogorov-Si...
Fields: Physics, Neuroscience, Signal Processing
Stochastic resonance — where adding noise to a subthreshold signal improves detection — is the physical mechanism behind mechanoreceptor hair cell bundle noise and neural population coding; the optima...
Fields: Mathematics, Evolutionary Biology, Information Theory, Statistics
The space of probability distributions over a discrete variable forms a Riemannian manifold equipped with the Fisher information metric g_{ij} = E[∂_i log p · ∂_j log p], where i,j index parameters of...
Fields: Mathematics, Quantum Physics, Neuroscience, Machine Learning, Computational Neuroscience
Tensor networks (TN) are graphical representations of high-dimensional arrays in which each tensor is a node and contractions between shared indices are edges. Matrix product states (MPS) represent a ...
Fields: Mathematics, Approximation Theory, Computer Science, Machine Learning
Universal approximation theorem (Cybenko 1989, Hornik et al. 1989): a feedforward neural network with one hidden layer and sufficient neurons can approximate any continuous function on a compact domai...
Fields: Mathematics, Computer Science, Cybersecurity, Network Science
Lateral movement after initial compromise is often modeled as random or attacker-chosen hops on a graph of hosts, accounts, and trust relationships. Bond percolation (edges open with probability p) an...
Fields: Mathematics, Computer Science, Materials Science
The bridge is mathematical rather than material: segmentation algorithms can borrow phase-field regularization intuition, but image classes are not thermodynamic phases. The useful transfer is in inte...
Fields: Mathematics, Computer Science, Type Theory, Functional Programming
Category theory — the abstract mathematics of structure-preserving maps — is not merely analogous to functional programming; it is the precise mathematical semantics of statically-typed functional lan...
Fields: Mathematics, Logic, Computer Science, Complexity Theory, Proof Theory, Type Theory
The Cook-Levin theorem (Cook 1971, Levin 1973): SAT is NP-complete — every problem in NP polynomially reduces to Boolean satisfiability. P vs NP (Clay Millennium Problem): does every efficiently verif...
Fields: Mathematics, Computer Science, Statistics, Signal Processing, Applied Mathematics
The Shannon-Nyquist sampling theorem states that a band-limited signal must be sampled at a rate of at least twice its highest frequency to allow perfect reconstruction. For a signal with n degrees of freedom, n measure...
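A minimal sketch of what failing the sampling condition looks like (frequencies chosen only for illustration): a 3 Hz tone sampled at 4 Hz, below its Nyquist rate of 6 Hz, is sample-for-sample identical to its alias at 3 - 4 = -1 Hz.

```python
import math

fs = 4.0          # sampling rate, below the Nyquist rate 2 * 3 Hz
n_samples = 32
tone = [math.sin(2 * math.pi * 3.0 * n / fs) for n in range(n_samples)]
alias = [math.sin(2 * math.pi * (3.0 - fs) * n / fs) for n in range(n_samples)]
# The two sequences of samples are indistinguishable.
max_diff = max(abs(a - b) for a, b in zip(tone, alias))
```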
Fields: Mathematics, Computer Science, Signal Processing, Machine Learning
The convolution theorem states that convolution becomes pointwise multiplication in the Fourier domain (with appropriate boundary conditions). CNNs implement spatial convolution with learned kernels, ...
Fields: Mathematics, Number Theory, Computer Science, Cryptography, Algebra, Complexity Theory
RSA (Rivest, Shamir, Adleman 1978): public key e, private key d, modulus n = pq (product of two large primes). Key relationship: ed ≡ 1 (mod φ(n)) where φ(n) = (p-1)(q-1) is Euler's totient function. ...
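The key relationship can be exercised end to end with textbook-sized primes (a toy sketch only; real deployments use ~2048-bit moduli and padding schemes):

```python
# Toy RSA with tiny primes, illustration only.
p, q = 61, 53
n = p * q                  # modulus, 3233
phi = (p - 1) * (q - 1)    # Euler's totient, 3120
e = 17                     # public exponent, coprime to phi
d = pow(e, -1, phi)        # private exponent: e*d = 1 (mod phi), Python 3.8+

m = 65                     # message, must satisfy m < n
c = pow(m, e, n)           # encrypt: c = m^e mod n
recovered = pow(c, d, n)   # decrypt: m = c^d mod n
```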
Fields: Mathematical Logic, Type Theory, Computer Science, Proof Theory, Programming Language Theory
The Curry-Howard isomorphism (independently discovered by Haskell Curry in 1934 for combinatory logic and William Howard in 1969 for natural deduction) establishes an exact correspondence between the ...
Fields: Statistics, Machine Learning, Computer Science
The bridge makes the frequentist penalty/Bayesian prior equivalence explicit for model selection under correlated designs. It is useful for calibrating regularization paths, but posterior uncertainty ...
Fields: Mathematics, Computer Science, Cryptography
The chord-and-tangent group law is uniform across fields — explaining why textbooks illustrate ℂ/Λ pictorially — but security proofs and side-channel engineering operate on Galois cohomology, embeddin...
Fields: Machine Learning, Combinatorics, Computer Science
Message-passing graph neural networks (MPGNNs) are at most as powerful as the 1-Weisfeiler-Lehman (1-WL) color refinement algorithm: two graphs that 1-WL cannot distinguish will be assigned identical ...
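A minimal colour-refinement sketch (illustrative code, not a library implementation) shows the canonical failure case: a 6-cycle versus two disjoint triangles, which are non-isomorphic but both 2-regular, so 1-WL (and hence any MPGNN) cannot separate them.

```python
def wl_colors(adj, rounds=5):
    # adj: dict node -> list of neighbours. Each round refines colours by
    # hashing (own colour, sorted multiset of neighbour colours).
    color = {v: 0 for v in adj}
    for _ in range(rounds):
        sig = {v: (color[v], tuple(sorted(color[u] for u in adj[v])))
               for v in adj}
        palette = {s: i for i, s in enumerate(sorted(set(sig.values())))}
        color = {v: palette[sig[v]] for v in adj}
    return sorted(color.values())

c6 = {i: [(i - 1) % 6, (i + 1) % 6] for i in range(6)}
two_triangles = {0: [1, 2], 1: [0, 2], 2: [0, 1],
                 3: [4, 5], 4: [3, 5], 5: [3, 4]}
indistinguishable = wl_colors(c6) == wl_colors(two_triangles)
```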
Fields: Mathematics, Computer Science, Network Science, Geometry
Trees embed with low distortion in hyperbolic space because the number of tree nodes grows exponentially with depth, matching the exponential volume growth of hyperbolic balls. Poincaré and Lorentz models therefore yield com...
Fields: Mathematics, Computer Science
Information geometry (Amari 1985) applies differential geometry to the statistical manifold — the space of probability distributions parametrised by θ. The Fisher information matrix g_ij(θ) = E[(∂log ...
Fields: Mathematics, Computer Science, Machine Learning, Linear Algebra
A deep neural network f(x) = σ(W_L · σ(W_{L-1} · ... · σ(W_1 x))) is architecturally a composition of linear maps (weight matrices Wᵢ ∈ ℝ^{n×m}) and pointwise nonlinearities. Backpropagation computes ...
Fields: Robust Statistics, Astronomy, Computer Science
The bridge is methodological. Astronomical cross-matching can use robust geometric-estimation ideas, but sky-survey outliers are not uniformly random, so standard RANSAC sampling assumptions require d...
Fields: Mathematics, Computer Science, Machine Learning
The bridge is pedagogical and formal at the level of density theorems: both results say an expressive algebra or network family can approximate continuous functions on compact domains. It does not imp...
Fields: Mathematics, Computer Science, Logic, Type Theory, Programming Languages
The Curry-Howard isomorphism (Curry 1934 combinatory logic; Howard 1969 natural deduction) establishes: types ↔ propositions; programs ↔ proofs; program execution ↔ proof normalization; function types...
Fields: Mathematics, Computer Science, Machine Learning
Kantorovich duality expresses W₁ as a supremum over 1-Lipschitz test functions; empirical WGAN critics approximate this supremum with neural nets, and gradient-penalty variants (Gulrajani et al.) dire...
Fields: Ecology, Mathematics, Computer Science, Behavioral Ecology
Optimal foraging theory predicts a forager leaves a patch when the marginal capture rate equals the long-run average intake rate achievable in the habitat — a stopping rule derived from renewal argume...
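The stopping rule can be solved numerically for a toy gain function (the exponential-saturation gain and the travel times are assumptions for illustration): leave the patch when the marginal rate g'(t) equals the average rate g(t)/(t + tau).

```python
import math

# Marginal value theorem sketch with diminishing-returns gain
# g(t) = 1 - exp(-t) and travel time tau between patches.
def mvt_residence_time(tau, lo=1e-6, hi=50.0):
    def f(t):
        # Root of f gives g'(t)*(t + tau) = g(t), rearranged.
        return math.exp(-t) * (t + tau) - (1.0 - math.exp(-t))
    for _ in range(100):       # bisection: f(lo) > 0, f(hi) < 0
        mid = 0.5 * (lo + hi)
        if f(lo) * f(mid) <= 0:
            hi = mid
        else:
            lo = mid
    return 0.5 * (lo + hi)

t_short = mvt_residence_time(tau=0.5)   # cheap travel: leave patches early
t_long = mvt_residence_time(tau=5.0)    # costly travel: stay longer
```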
Fields: Mechanism Design, Microeconomics, Computer Science, Game Theory
In a second-price sealed-bid auction, truthful bidding is a weakly dominant strategy: bidders should bid their values. Vickrey–Clarke–Groves mechanisms generalize this idea to allocate discrete goods ...
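A small simulation sketch (values and rival bids are illustrative) confirms that, against fixed rival bids, no deviation from truthful bidding improves the bidder's payoff:

```python
# Second-price sealed-bid auction: highest bid wins, winner pays the
# highest losing bid.
def second_price_outcome(bids):
    order = sorted(range(len(bids)), key=lambda i: bids[i], reverse=True)
    return order[0], bids[order[1]]          # (winner index, price)

def utility(value, my_bid, other_bids):
    winner, price = second_price_outcome([my_bid] + list(other_bids))
    return value - price if winner == 0 else 0.0

value = 10.0
others = [7.0, 4.0]
truthful = utility(value, value, others)     # bid exactly the value
# Over- and under-bidding never beat truthfulness against these rivals:
deviations = [utility(value, b, others) for b in (2.0, 6.0, 8.0, 12.0, 20.0)]
```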
Fields: Mathematics, Engineering, Computer Science, Machine Learning
Convex optimization: minimize f(x) subject to x in C (convex set). The Lagrangian L(x,lambda,mu) = f(x) + lambda^T h(x) + mu^T g(x) and dual function q(lambda,mu) = inf_x L satisfy strong duality (pri...
Fields: Mathematics, Engineering, Computer Science
Lang's TreeMaker algorithm formalizes origami design: a model's silhouette is described as a stick figure (tree graph) with branch lengths; TreeMaker finds a circle/ellipse packing on the square paper...
Fields: Mathematics, Operations Research, Engineering, Industrial Engineering, Computer Science
Queuing theory analyses systems where arriving customers wait for service. The canonical M/M/1 queue (Poisson arrivals at rate λ, exponential service times with rate μ) requires utilisation ρ = λ/μ < ...
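The standard M/M/1 steady-state formulas, plus a Little's-law consistency check (the arrival and service rates are illustrative):

```python
# M/M/1 steady state, valid only when rho = lam/mu < 1.
def mm1_metrics(lam, mu):
    rho = lam / mu
    L = rho / (1.0 - rho)         # mean number in system
    W = 1.0 / (mu - lam)          # mean time in system
    Lq = rho ** 2 / (1.0 - rho)   # mean queue length, excluding in service
    return rho, L, W, Lq

# 8 arrivals/hour against a 10/hour server: 80% utilisation.
rho, L, W, Lq = mm1_metrics(lam=8.0, mu=10.0)
# Little's law ties the two together: L = lam * W.
```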
Fields: Mathematics, Engineering, Signal Processing, Harmonic Analysis, Image Processing, Statistics
Wavelets provide a multi-resolution analysis (MRA) of signals: a nested sequence of approximation spaces V_j ⊂ V_{j+1} ⊂ L²(ℝ) with scaling function φ and wavelet ψ satisfying ⟨ψ(·-k), ψ(·-l)⟩ = δ_{kl...
Fields: Mathematics, Game Theory, Evolutionary Biology, Machine Learning, Economics
Maynard Smith & Price (1973) showed that natural selection on heritable strategies converges to evolutionarily stable strategies (ESS), Nash equilibria satisfying an additional stability condition in the payoff game defined by fi...
Fields: Linguistics, Information Theory, Mathematics, Statistical Physics, Cognitive Science
Zipf (1935, 1949) documented that in any natural language corpus the r-th most frequent word has frequency f_r ≈ C / r (Zipf's law, exponent α ≈ 1). He proposed a "principle of least effort": ...
Fields: Numerical Analysis, Computational Fluid Dynamics, Medical Imaging, Computer Science
Finite-volume schemes maintain discrete conservation ∑ F·n Δt across faces; cut-cell methods redistribute fluxes when an embedded boundary slices Cartesian cells. Voxel segmentation assigns partial ti...
Fields: Mathematics, Medicine, Signal Processing, Topology
Topological summaries of sliding-window cardiac time-series can capture state-transition structure missed by threshold statistics. This extends established TDA disease-subtyping ideas into real-time r...
Fields: Mathematics, Neuroscience, Cognitive Science, Statistics, Information Theory
The predictive coding framework (Rao & Ballard 1999) proposes that cortical processing is bidirectional: top-down connections carry predictions x̂_L = f(x_{L+1}) from higher to lower levels, while bot...
Fields: Mathematics, Dynamical Systems, Neuroscience, Computational Neuroscience, Nonlinear Physics
Neural populations exhibit characteristic oscillations (alpha 8-12 Hz, gamma 30-80 Hz, theta 4-8 Hz, beta 12-30 Hz) whose emergence, frequency, and amplitude are governed by the bifurcation structure ...
Fields: Mathematics, Neuroscience, Computer Science, Cognitive Science, Computational Neuroscience
Temporal difference (TD) learning (Sutton 1988; Sutton & Barto 1998) defines the prediction error: δ_t = r_t + γV(s_{t+1}) − V(s_t), where r_t is the reward received, γ ∈ (0,1) is the discount factor,...
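A minimal TD(0) sketch on a two-transition deterministic chain (all constants are illustrative) converges to the exact discounted values:

```python
# Chain: s0 -> s1 -> terminal, reward 1 on the final transition.
# True values: V(s1) = 1, V(s0) = gamma * 1 = 0.9.
gamma = 0.9
alpha = 0.1
V = [0.0, 0.0, 0.0]     # V[2] is the terminal state, fixed at 0

for _ in range(2000):
    # s0 -> s1, reward 0: delta_t = r + gamma*V(s') - V(s)
    delta = 0.0 + gamma * V[1] - V[0]
    V[0] += alpha * delta
    # s1 -> terminal, reward 1
    delta = 1.0 + gamma * V[2] - V[1]
    V[1] += alpha * delta
```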
Fields: Mathematics, Physics, Signal Processing, Quantum Mechanics, Applied Mathematics
The Fourier transform F(ω) = ∫f(t)e^{-iωt}dt decomposes any square-integrable function into sinusoidal components, establishing a bijective correspondence between the time domain and frequency domain....
Fields: Mathematics, Statistical Physics, Network Science, Computer Science, Epidemiology
Percolation theory, originally developed for porous media and ferromagnetism, describes the emergence of large-scale connectivity in random structures. Site percolation on a network: each node is "occ...
Fields: Mathematics, Structural Biology, Medical Imaging, Machine Learning
Cryo-EM particle images sample continuous conformational variation; Laplacian eigenmaps provide a mathematically grounded coordinate system for this manifold. The bridge is strong but still partly spe...
Fields: Medical Imaging, Machine Learning, Inverse Problems
Speculative analogy (to be empirically validated): DDPM score fields can act as learned regularizers in MRI inverse problems, replacing hand-crafted priors while preserving fidelity constraints from s...
Fields: Medicine, Machine Learning, Health Informatics
Speculative analogy (to be empirically validated): self-attention can unify sparse longitudinal clinical events into context-aware risk representations similarly to flexible sequence transduction in l...
Fields: Network Science, Infectious Disease, Machine Learning
Speculative analogy (to be empirically validated): graph convolutional message passing can infer latent transmission linkage structure by integrating mobility, genomic, and contact-network signals und...
Fields: Neuroscience, Biophysics, Computational Neuroscience
The Tsodyks-Markram (TM) resource model of short-term synaptic depression: dx/dt = (1-x)/τ_rec - u·x·δ(t-t_spike) where x ∈ [0,1] is available vesicle fraction, τ_rec is recovery time constant, and u ...
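The depression dynamics can be iterated exactly between spikes, since the recovery ODE is linear (parameter values here are assumptions for illustration): each spike consumes a fraction u of the resource, and x relaxes back toward 1 with time constant τ_rec in the inter-spike interval.

```python
import math

# Steady-state available resource x under regular spiking at rate_hz.
def steady_state_resource(rate_hz, tau_rec=0.8, u=0.5, n_spikes=200):
    isi = 1.0 / rate_hz
    x = 1.0
    for _ in range(n_spikes):
        x = x * (1.0 - u)                                # spike uses u*x
        x = 1.0 - (1.0 - x) * math.exp(-isi / tau_rec)   # exact recovery
    return x

x_slow = steady_state_resource(rate_hz=1.0)    # near-full recovery
x_fast = steady_state_resource(rate_hz=20.0)   # strong depression
```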
Fields: Neuroscience, Cognitive Science, Bayesian Inference, Computational Neuroscience
Hierarchical Bayesian inference requires propagating predictions from high-level models downward and prediction errors from low-level observations upward. Rao & Ballard (1999) showed that a two-level...
Fields: Neuroscience, Synaptic Plasticity, Computer Science, Deep Learning, Computational Neuroscience
Backpropagation (Rumelhart, Hinton & Williams 1986) is an efficient algorithm for computing gradients of a loss function with respect to all parameters in a multilayer neural network via the chain rul...
Fields: Neuroscience, Computer Science, Machine Learning
Literature alignment at the objective level—CPC trains representations to predict latent summaries across temporal or view splits using contrastive classification; speculative analogy for biology—brai...
Fields: Neuroscience, Computer Science, Machine Learning
Conceptual bridge (not a literal neural isomorphism): both traditions trade fidelity of retained information against complexity or redundancy constraints; speculative analogy for practice—IB-style obj...
Fields: Reinforcement Learning, Neuroscience, Computational Neuroscience
Algorithmic intrinsic rewards encourage exploration by rewarding visits to rarely experienced states or large forward-model prediction errors; neuroscience proposes exploratory behaviors arise when ag...
Fields: Neuroscience, Computer Science
Both domains confront temporally separated events (weak tetanus vs protein synthesis arrival; write hits vs directory responses) that must reconcile local state with global consistency — tagging resem...
Fields: Neuroscience, Control Theory, Motor Control, Computational Neuroscience
The brain implements internal models (forward and inverse models) for motor control. Forward model: given efference copy of motor command u, predict sensory outcome ŷ = f(u). Inverse model: given desi...
Fields: Neuroscience, Engineering, Neural Engineering, Information Theory, Signal Processing
BCIs decode intended movement from neural population activity recorded by electrode arrays. Linear decoding: ŷ = Wx + b where x ∈ R^N is the spike rate vector from N neurons, y is decoded kinematics (...
Fields: Neuroscience, Engineering, Psychiatry, Computer Science
Computational psychiatry applies mathematical models of brain computation to explain the mechanisms of psychiatric symptoms and guide treatment. The aberrant salience hypothesis (Kapur 2003): excess s...
Fields: Neuroscience, Engineering, Signal Processing, Computational Neuroscience
The Kalman filter alternates prediction using a dynamics model with an innovation update weighted by the Kalman gain, minimizing mean-squared estimation error under Gaussian assumptions. Canonical neu...
Fields: Computational Neuroscience, Electrical Engineering, Neuromorphic Computing
Cell membrane lipid bilayer acts as capacitance C_m per area; ion channels provide conductances g giving τ_m = C_m/g. Subthreshold LIF ignores spike-generation nonlinearities but preserves low-pass fi...
Fields: Neuroscience, Control Engineering, Computational Neuroscience, Robotics
Flash & Hogan (1985, J Neurosci 5:1688) showed that human arm trajectories minimise the integrated squared jerk (third derivative of position), generating smooth bell-shaped velocity profiles characteristic of minimum-j...
Fields: Neuroscience, Engineering, Control Theory, Biomedical Engineering, Computational Neuroscience
Neuroprosthetics is the engineering discipline of closing the sensorimotor loop with a brain-machine interface — decoding neural signals as control commands for prosthetic limbs and feeding sensory in...
Fields: Computational Neuroscience, Electrical Engineering, Neuromorphic Computing, Machine Learning
Biological neural computation uses action potentials (spikes): discrete, all-or-nothing pulses of ~100 mV amplitude and ~1 ms duration. Neurons transmit information via: 1. RATE CODING: firing rate r(...
Fields: Neuroscience, Information Theory, Sensory Physiology, Computational Neuroscience
The nervous system encodes stimuli as spike trains — discrete all-or-none action potentials — which can be analysed as Shannon communication channels. The channel capacity C = B log₂(1 + S/N) bounds t...
Fields: Neuroscience, Information Theory, Cognitive Science, Psychology
Ryan and Deci (2000, 27 k citations) established that competence, autonomy, and relatedness are fundamental psychological needs whose satisfaction predicts intrinsic motivation and well-being. Information theory and ...
Fields: Neuroscience, Linguistics, Cognitive Science, Computational Neuroscience
Friston's free-energy principle (2010) proposes that the brain is a hierarchical generative model that minimizes variational free energy F = KL[q(h)||p(h|s)] ≈ complexity - accuracy. At each level, to...
Fields: Neuroscience, Mathematics, Information Theory
IIT (Tononi 2004, 2014) defines Φ as the minimum information generated by a system as a whole beyond its minimum information partition (MIP). Mathematically, Φ is a measure over a causal structure (di...
Fields: Neuroscience, Mathematics, Computational Neuroscience, Biophysics
Classic computational neuroscience modeled neurons as point processors (integrate-and-fire), but dendritic recordings reveal that dendrites perform active computation: NMDA receptor activation create...
Fields: Neuroscience, Mathematics, Statistical Mechanics, Machine Learning, Neural Networks, Memory Theory
Hopfield networks (1982): N binary neurons sᵢ ∈ {-1,+1} with symmetric weights Wᵢⱼ = (1/N)Σ_μ ξ^μᵢ ξ^μⱼ (Hebb rule) and dynamics sᵢ(t+1) = sgn(Σⱼ Wᵢⱼsⱼ(t)). Energy E = -½Σᵢⱼ Wᵢⱼsᵢsⱼ decreases monotonica...
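A minimal sketch of the rule as stated (small network, one stored pattern; sizes are illustrative): Hebbian weights plus sign-function dynamics pull a corrupted pattern back to the stored attractor.

```python
# Hebbian training: W_ij = (1/N) * sum_mu xi^mu_i * xi^mu_j, zero diagonal.
def train(patterns):
    n = len(patterns[0])
    W = [[0.0] * n for _ in range(n)]
    for p in patterns:
        for i in range(n):
            for j in range(n):
                if i != j:
                    W[i][j] += p[i] * p[j] / n
    return W

# Repeated sign updates: s_i <- sgn(sum_j W_ij * s_j).
def recall(W, state, sweeps=5):
    s = list(state)
    for _ in range(sweeps):
        for i in range(len(s)):
            h = sum(W[i][j] * s[j] for j in range(len(s)))
            s[i] = 1 if h >= 0 else -1
    return s

pattern = [1, 1, 1, 1, -1, -1, -1, -1]
W = train([pattern])
corrupted = [-1, 1, 1, 1, -1, -1, -1, 1]   # two bits flipped
recovered = recall(W, corrupted)
```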
Fields: Computational Neuroscience, Algebraic Topology, Mathematics, Data Science, Cognitive Neuroscience
Topological data analysis (TDA) applies algebraic topology to data clouds. The key tool is persistent homology: given a set of points (neurons), build a growing sequence of simplicial complexes (Čech ...
Fields: Systems Neuroscience, Signal Processing, Machine Learning, Dimensionality Reduction, Computational Neuroscience
Modern Neuropixels probes record from 384–960 electrodes simultaneously, capturing spikes from hundreds of neurons. Spike sorting — attributing voltage deflections to individual neurons — proceeds as:...
Fields: Neuroscience, Mathematics, Topology, Computational Neuroscience, Algebraic Topology
Neural activity exists in high-dimensional space (one dimension per neuron), but the activity patterns activated by natural stimuli lie on low-dimensional manifolds. Algebraic topology — specifically ...
Fields: Theoretical Neuroscience, Cognitive Science, Statistical Physics, Thermodynamics, Information Theory
The thermodynamic free energy in statistical mechanics is F = U - TS, where U is internal energy, T is temperature, and S is entropy. A system at equilibrium minimises F, which is equivalent to maximi...
Fields: Neuroscience, Physics, Statistical Mechanics, Computational Neuroscience
Self-organised criticality (SOC): Bak, Tang & Wiesenfeld (1987) discovered that many open dissipative systems naturally evolve toward a critical state characterised by power-law distributions, without...
Fields: Neuroscience, Statistical Mechanics, Machine Learning, Computational Neuroscience
Long short-term memory networks (Hochreiter & Schmidhuber 1997, 96 k citations) solve the vanishing gradient problem via gating mechanisms that selectively control information flow through time. Stati...
Fields: Neuroscience, Psychophysics, Physics, Information Theory, Sensory Biology, Cognitive Science
Weber's law (1834): the just noticeable difference ΔS for a stimulus of intensity S is proportional to S: ΔS/S = k (Weber fraction, constant per modality). For brightness, k ≈ 0.02; for weight, k ≈ 0....
Fields: Neuroscience, Physics, Statistical Mechanics, Computational Neuroscience
Hebb's (1949) postulate — "neurons that fire together wire together" — is formally expressed as ΔW_{ij} = η·xᵢ·xⱼ, a correlation-based learning rule that strengthens synaptic weight W_{ij} when pre-sy...
Fields: Neuroscience, Signal Processing, Sensory Biology
An FM chirp s(t) = A·cos(2π(f₀t + ½μt²)) (μ = chirp rate, BW = μ·T) has pulse compression ratio PCR = BW·T >> 1, giving range resolution δr = c/(2·BW) while retaining high energy (SNR = A²T/(2N₀)) fro...
Fields: Neuroscience, Signal Processing, Information Theory
The problem of decoding motor intent from neural population activity is an optimal state estimation problem: spike trains from N neurons encode a low-dimensional movement state x(t) with Fisher inform...
Fields: Neuroscience, Statistics, Cognitive Science, Bayesian Inference, Computational Neuroscience
Helmholtz (1867) proposed that perception is "unconscious inference" — the brain uses prior knowledge to resolve ambiguous sensory input. This informal insight has been formalised into the Bayesian br...
Fields: Neuroscience, Statistics, Signal Processing, Machine Learning, Electrophysiology
EXTRACELLULAR RECORDING MIXING MODEL: A recording electrode at position x measures a weighted sum of spike waveforms from N nearby neurons: y(t) = Σᵢ Aᵢ · sᵢ(t) + noise, where Aᵢ = mixing matrix en...
Fields: Oceanography, Machine Learning, Fluid Dynamics
Speculative analogy (to be empirically validated): Spectral neural surrogates can emulate energy-transfer dynamics across scales similarly to reduced spectral ocean models used for submesoscale foreca...
Fields: Pharmacology, Machine Learning, Dynamical Systems
Speculative analogy (to be empirically validated): continuous-time latent dynamics learned by neural ordinary differential equations can serve as constrained surrogates for compartmental PK models whe...
Fields: Philosophy Of Science, Information Theory, Mathematics, Statistics, Machine Learning
Kolmogorov (1965) defined the complexity K(x) of a string x as the length (in bits) of the shortest program on a universal Turing machine U that outputs x and halts. Solomonoff (1964) independently de...
Fields: Statistical Physics, Machine Learning, Information Theory
Deep neural networks undergo a series of phenomena that are strikingly described by the language of statistical physics phase transitions: 1. **Grokking (Power et al. 2022)**: a model trains to 100% t...
Fields: Neuroscience, Computer Science
Optogenetic tools (channelrhodopsins, halorhodopsins) implement real-time feedback control of neural circuits; light pulses are control inputs, spike rates are controlled outputs, and closed-loop opto...
Fields: Physics, Thermodynamics, Information Theory, Cognitive Science, Consciousness Studies, Neuroscience
Integrated information theory (IIT; Tononi 2004) defines consciousness as Φ, the amount of irreducible integrated information: the effective information generated by the whole system above and beyond ...
Fields: Physics, Computer Science, Neuroscience
The Hopfield neural network for associative memory is exactly the Ising spin glass model; stored memories correspond to local energy minima, retrieval is energy minimization, and the network's memory ...
Fields: Physics, Computer Science, Mathematics
Quantum annealing (Kadowaki & Nishimori 1998) uses quantum tunneling through energy barriers rather than thermal fluctuations (classical simulated annealing) to find global minima of cost functions. T...
Fields: Quantum Physics, Computer Science, Embedded Systems, Control Theory
Quantum survival amplitude after N measurements scales roughly as (1 − ΓΔt)^N for short intervals Δt, motivating exponential-in-(measurement rate) suppression resembling heuristic reliability gains wh...
Fields: Physics, Computer Science, Machine Learning
Pedagogical bridge (widely discussed, contested as literal identification): layerwise feature transformations resemble iterative coarse-graining because both discard microscopic degrees of freedom whi...
Fields: Physics, Computer Science, Machine Learning
Established modeling correspondence: RBMs define bipartite energy functions whose Gibbs distribution parallels Boltzmann weights on interacting latent-visible spins up to representation choices; specu...
Fields: Physics, Computer Science
The free energy of an Ising spin glass with random couplings, computed via the replica trick and replica-symmetry breaking (RSB) ansatz, maps exactly onto the satisfiability threshold of random k-SAT ...
Fields: Quantum Information, Condensed Matter Physics, Topological Field Theory, Quantum Computing
Kitaev's toric code (2003) is simultaneously: (A) A quantum error-correcting code with macroscopic code distance, where logical qubits are encoded in global topological degrees of freedom immune t...
Fields: Statistical Physics, Neuroscience, Machine Learning
The Hopfield (1982) model of associative memory is mathematically identical to the Sherrington-Kirkpatrick spin glass: neuron states map to spins, synaptic weights to random exchange couplings, and st...
Fields: Physics, Computer Science, Statistical Mechanics
A Boltzmann machine is a stochastic neural network whose equilibrium distribution is the Boltzmann distribution of an Ising-type Hamiltonian; training by contrastive divergence minimizes the KL diverg...
Fields: Physics, Computer Science
The cavity method of spin glass theory (Mézard & Parisi) and the belief propagation algorithm in graphical models are identical mathematical objects; the Bethe free energy approximation corresponds to...
Fields: Computer Science, Mathematics, Physics
Diffusion generative models (DALL-E, Stable Diffusion) learn to reverse a stochastic diffusion process (data to noise) by estimating the score function ∇_x log p(x); the generative SDE is the time...
Fields: Physics, Computer Science, Statistical Mechanics
In the infinite-width limit, a deep neural network at initialization is exactly a Gaussian process with a kernel determined by the activation function (NNGP kernel); mean field theory of neural networ...
Fields: Physics, Computer Science, Quantum Information
Topological quantum error correction (surface codes, toric codes) encodes logical qubits in the global topology of anyon configurations; logical errors require macroscopic anyon movement, making decoh...
Fields: Physics, Computer Science, Mathematics
Quantum walks replace classical random walk coin flipping with quantum superposition and interference; the probability distribution spreads ballistically (σ ∝ t) rather than diffusively (σ ∝ √t), prov...
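The ballistic-versus-diffusive contrast can be seen in a few lines (a Hadamard-coin walk on the line; the step count is illustrative): after t steps the quantum spread is roughly 0.54·t, against √t for the classical walk.

```python
import math

# Discrete-time Hadamard walk on the integers; walker starts at 0 with
# coin state "up". amps[pos] = (amplitude_up, amplitude_down).
def hadamard_walk_sigma(steps):
    amps = {0: (1.0 + 0.0j, 0.0j)}
    s = 1.0 / math.sqrt(2.0)
    for _ in range(steps):
        new = {}
        for pos, (up, down) in amps.items():
            u, d = s * (up + down), s * (up - down)   # Hadamard coin
            nu = new.get(pos + 1, (0.0j, 0.0j))        # "up" shifts right
            new[pos + 1] = (nu[0] + u, nu[1])
            nd = new.get(pos - 1, (0.0j, 0.0j))        # "down" shifts left
            new[pos - 1] = (nd[0], nd[1] + d)
        amps = new
    probs = {p: abs(u) ** 2 + abs(d) ** 2 for p, (u, d) in amps.items()}
    mean = sum(p * w for p, w in probs.items())
    return math.sqrt(sum((p - mean) ** 2 * w for p, w in probs.items()))

t = 60
sigma_quantum = hadamard_walk_sigma(t)   # grows ~ t (ballistic)
sigma_classical = math.sqrt(t)           # grows ~ sqrt(t) (diffusive)
```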
Fields: Physics, Computer Science, Statistical Mechanics
The renormalization group (RG) flow in statistical physics — iteratively integrating out short-scale degrees of freedom — is mathematically equivalent to the hierarchical feature extraction performed ...
Fields: Physics, Computer Science, Information Theory
Lossy data compression (JPEG, MP3, rate-distortion theory) and the renormalization group (integrating out short-scale fluctuations) both perform optimal coarse-graining: both discard information that...
Fields: Computer Science, Physics
Reservoir computing (echo state networks, liquid state machines) projects input time series through a fixed high-dimensional recurrent network (the reservoir) operating near the edge of chaos; only th...
Fields: Physics, Computer Science, Statistical Mechanics
Simulated annealing solves combinatorial optimization by mimicking thermal annealing: accepting uphill moves with probability exp(−ΔE/T) and slowly reducing T; this is exactly the Metropolis-Hast...
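A toy sketch of that acceptance rule on a rugged 1-D landscape; the energy function, cooling schedule, and step size are arbitrary illustrations:

```python
import math, random

random.seed(1)

# Simulated annealing on a rugged 1-D energy landscape. Uphill moves are
# accepted with the Metropolis probability exp(-dE/T), and T is lowered
# geometrically -- the Metropolis-Hastings rule with a temperature schedule.
def energy(x):
    return x * x + 4 * math.sin(5 * x)

def anneal(x=4.0, T=5.0, T_min=1e-3, cooling=0.999, step=0.5):
    E = energy(x)
    best_x, best_E = x, E
    while T > T_min:
        x_new = x + random.uniform(-step, step)
        dE = energy(x_new) - E
        # Always accept downhill moves; uphill with probability exp(-dE/T).
        if dE <= 0 or random.random() < math.exp(-dE / T):
            x, E = x_new, E + dE
        if E < best_E:
            best_x, best_E = x, E
        T *= cooling
    return best_x, best_E

best_x, best_E = anneal()
```

With a slow enough schedule the walker escapes the local wells of the sine term and settles near the global minimum instead of the nearest basin.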
Fields: Physics, Computer Science, Information Theory
Boltzmann's thermodynamic entropy S = k_B ln Ω and Shannon's information entropy H = −Σ pᵢ log pᵢ are the same mathematical object; physical heat dissipation and information erasure are two fa...
Fields: Computer Science, Physics, Statistical Mechanics, Machine Learning
Variational Bayesian inference minimizes the variational free energy F = E_q[log q] − E_q[log p] (equivalent to maximizing the ELBO), which is identical to the Helmholtz free energy F = U - TS in statisti...
Fields: Statistical Mechanics, Macroecology, Information Theory, Biodiversity Science
Jaynes (1957) showed that the Boltzmann-Gibbs distribution is the unique probability distribution that maximizes Shannon entropy subject to known macroscopic constraints (e.g. fixed mean energy). Hart...
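A numerical check of Jaynes' result on made-up energy levels: bisecting on the Lagrange multiplier β until the Gibbs weights exp(−βEᵢ) hit a chosen mean energy (the levels and target are illustrative):

```python
import numpy as np

# MaxEnt: over states with energies E_i, the distribution maximizing Shannon
# entropy subject to a fixed mean energy <E> is p_i proportional to exp(-beta*E_i).
# Solve for the Lagrange multiplier beta by bisection on the constraint.
E = np.array([0.0, 1.0, 2.0, 3.0])   # toy energy levels
target_mean = 1.2                    # below the uniform mean 1.5, so beta > 0

def gibbs(beta):
    w = np.exp(-beta * E)
    return w / w.sum()

def mean_energy(beta):
    return float(gibbs(beta) @ E)

lo, hi = -50.0, 50.0   # mean energy is monotone decreasing in beta
for _ in range(200):
    mid = 0.5 * (lo + hi)
    if mean_energy(mid) > target_mean:
        lo = mid          # beta too small: raise it to cool the distribution
    else:
        hi = mid
beta = 0.5 * (lo + hi)
p = gibbs(beta)
```

The recovered p is exactly the Boltzmann-Gibbs form; any other distribution with the same mean energy has strictly lower Shannon entropy.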
Fields: Economics, Computer Science, Information Theory
Sims' rational inattention model formalizes attention as a scarce cognitive resource with Shannon mutual information as the cost; optimal attention allocation under entropy cost produces price stickin...
Fields: Quantum Physics, Microwave Engineering, Electrical Engineering, Information Theory
Caves derived that a linear phase-preserving amplifier with large gain must introduce noise equivalent to at least half a quantum at the input port when referenced against the signal quadrature, trans...
Fields: Physics, Computer Engineering, Thermodynamics, Neuromorphic Computing, Information Theory
Landauer's principle (1961) establishes that logically irreversible operations — those that erase information — must dissipate at least k_BT ln 2 ≈ 3×10⁻²¹ J per bit at room temperature into the envir...
Fields: Thermodynamics, Information Theory, Cosmology, Statistical Mechanics
Three apparently separate arrows of time — thermodynamic (entropy increases), computational (Landauer: erasing one bit dissipates at least k_B T ln 2 of heat), and cosmological (the universe began in ...
Fields: Thermodynamics, Information Theory, Statistical Physics, Computer Science
Landauer (1961) proved that erasing one bit of information in a thermal environment at temperature T requires dissipating at least k_B T ln 2 of free energy as heat — approximately 3 zJ at room t...
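The quoted room-temperature figure is a one-line computation:

```python
import math

# Landauer bound: minimum heat dissipated to erase one bit at temperature T.
k_B = 1.380649e-23          # Boltzmann constant, J/K (exact in SI since 2019)
T = 300.0                   # room temperature, K
E_landauer = k_B * T * math.log(2)   # joules per erased bit
# Approximately 2.87e-21 J, i.e. roughly 3 zJ, matching the figure above.
```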
Fields: Physics, Materials Science, Condensed Matter Physics, Mathematics, Quantum Computing
Topological insulators (TIs) are materials whose electronic band structure has a bulk gap (like a conventional insulator) but whose surface or edge hosts gapless, conducting states protected by time-r...
Fields: Mathematics, Physics, Information Theory, Dynamical Systems
The Renyi entropy of order q, H_q = (1/(1−q)) log Σᵢ pᵢ^q, generates the full multifractal spectrum f(α) via Legendre transform τ(q) → f(α); turbulent velocity fields, strange attractor...
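A quick numerical sanity check of the formula on an arbitrary distribution: H_q is continuous at q = 1 (where it reduces to Shannon entropy) and decreases monotonically in q:

```python
import numpy as np

# Renyi entropy H_q = log(sum_i p_i^q) / (1 - q); q -> 1 recovers Shannon.
p = np.array([0.5, 0.25, 0.125, 0.125])

def renyi(p, q):
    return np.log(np.sum(p ** q)) / (1.0 - q)

shannon = -np.sum(p * np.log(p))   # = 1.75 ln 2 for this p
near_shannon = renyi(p, 1.0001)    # continuity at q = 1
h2 = renyi(p, 2.0)                 # collision entropy
h0 = renyi(p, 1e-12)               # ~ log(support size), the Hartley entropy
```

Sweeping q is exactly what the multifractal formalism does: each order weights the measure's dense and rare regions differently before the Legendre transform.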
Fields: Physics, Mathematics, Information Theory, Quantum Gravity, Thermodynamics
Bekenstein (1973) proposed that a black hole of horizon area A carries entropy S_BH = k_B A/(4 l_P²) (equivalently S_BH = A/4G in natural units). This is the maximum entropy that can be enclosed in a re...
Fields: Physics, Mathematics, Information Theory, Thermodynamics, Statistical Mechanics
The Boltzmann entropy S = k_B ln W and Shannon entropy H = −Σpᵢ log pᵢ are mathematically identical after substituting k_B and adjusting the logarithm base. Boltzmann counts microstates W consistent w...
Fields: Network Science, Statistical Physics, Neuroscience, Computer Science
Barabási & Albert (1999) showed that networks grown by preferential attachment — where new nodes connect preferentially to high-degree nodes ("rich get richer") — produce scale-free degree distributio...
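A compact sketch of the growth rule, using the standard trick that sampling uniformly from a list holding each node once per edge end is sampling proportional to degree (sizes and seed are illustrative):

```python
import random

random.seed(0)

# Barabasi-Albert growth: each new node attaches to m existing nodes chosen
# with probability proportional to degree ("rich get richer").
def barabasi_albert(n, m=2):
    targets = list(range(m))   # seed nodes for the first arrival
    repeated = []              # each node listed once per edge endpoint
    degree = {}
    for new in range(m, n):
        for t in set(targets):
            degree[t] = degree.get(t, 0) + 1
            degree[new] = degree.get(new, 0) + 1
            repeated.extend([t, new])
        # Uniform choice from `repeated` == preferential attachment.
        targets = [random.choice(repeated) for _ in range(m)]
    return degree

deg = barabasi_albert(2000, m=2)
degs = sorted(deg.values(), reverse=True)
```

The resulting degree sequence is heavy-tailed: a handful of early hubs accumulate large degree while most nodes keep degree near m, in line with the P(k) ∝ k⁻³ prediction.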
Fields: Neuroscience, Condensed Matter Physics, Statistical Mechanics, Information Theory
Neural avalanches (cascades of activity that follow a power-law size distribution) are the biological signature of a system operating near a second-order phase transition — the same mathematical struc...
Fields: Physics, Condensed Matter Physics, Computational Neuroscience, Machine Learning, Statistical Mechanics
The Hopfield network (1982) defines an energy function for a network of N binary neurons sᵢ ∈ {-1, +1} with symmetric weights Wᵢⱼ: E = -½ Σᵢ≠ⱼ Wᵢⱼ sᵢ sⱼ. This is formally identical to the Ising spi...
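A minimal sketch of the energy function and associative recall, assuming Hebbian storage of random patterns (sizes and corruption level are illustrative):

```python
import numpy as np

rng = np.random.default_rng(0)

# Hopfield network: Hebbian weights W = (1/N) sum_mu x_mu x_mu^T (zero
# diagonal). Asynchronous updates s_i <- sign(sum_j W_ij s_j) never increase
# the Ising energy E = -1/2 sum_{i!=j} W_ij s_i s_j, so stored patterns
# become attractors of the dynamics.
N = 100
patterns = rng.choice([-1, 1], size=(3, N))        # 3 stored memories
W = (patterns.T @ patterns) / N
np.fill_diagonal(W, 0.0)

def energy(s):
    return -0.5 * s @ W @ s

def recall(s, sweeps=5):
    s = s.copy()
    for _ in range(sweeps):
        for i in rng.permutation(N):               # asynchronous updates
            s[i] = 1 if W[i] @ s >= 0 else -1
    return s

# Corrupt a stored pattern by flipping 15 of its 100 bits, then relax.
probe = patterns[0].copy()
flip = rng.choice(N, size=15, replace=False)
probe[flip] *= -1
recovered = recall(probe)
overlap = (recovered @ patterns[0]) / N            # 1.0 means perfect recall
```

Three patterns at N = 100 is far below the ~0.138N capacity, so the corrupted probe falls back into the right energy minimum.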
Fields: Physics, Social Science, Network Science, Epidemiology, Information Theory
SIR RUMOUR MODEL (Daley & Kendall 1965): Individuals are Susceptible (haven't heard), Infected (spreading), Recovered (heard but no longer spreading). Rate equations: dS/dt = -βSI; dI/dt = βSI - γ...
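The rate equations integrate in a few lines; the parameter values below are illustrative (β/γ = 3 puts the system above the spreading threshold):

```python
# Forward-Euler integration of the SIR rate equations:
#   dS/dt = -beta*S*I,  dI/dt = beta*S*I - gamma*I,  dR/dt = gamma*I
beta, gamma = 0.3, 0.1          # spreading and stifling rates (assumed values)
S, I, R = 0.99, 0.01, 0.0       # fractions of the population
dt, steps = 0.1, 2000
history = []
for _ in range(steps):
    dS = -beta * S * I
    dI = beta * S * I - gamma * I
    dR = gamma * I
    S, I, R = S + dt * dS, I + dt * dI, R + dt * dR
    history.append((S, I, R))
```

The run shows the characteristic outbreak: a single infection peak, then decay, with a residual susceptible fraction that never hears the rumour.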
Fields: Public Health, Machine Learning, Epidemiology
Speculative analogy (to be empirically validated): Learned surrogates of expensive agent-based epidemic simulations can support policy search similarly to reduced-form intervention response surfaces i...
Fields: Quantum Computing, Computer Science, Operations Research
Established baseline literature maps QAOA-style parameterized quantum circuits onto classical optimization landscapes; related speculative analogy (deployment-dependent): classical surrogate models tr...
Fields: Quantum Computing, Cryptography, Information Theory
BB84 quantum key distribution achieves information-theoretic security (proven secure against computationally unbounded adversaries) because any eavesdropping measurement on quantum states introduces d...
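The detectability claim can be checked with a classical Monte Carlo of the standard intercept-resend attack: Eve guesses the basis wrong half the time, randomizing those bits, which shows up as an expected 25% error rate in the sifted key:

```python
import random

random.seed(42)

# BB84 sifting with an intercept-resend eavesdropper. Eve measures every
# qubit in a random basis and resends; where her basis was wrong, Bob's
# outcome is random even when his basis matches Alice's.
n = 20000
errors = sifted = 0
for _ in range(n):
    bit = random.randint(0, 1)
    basis_a = random.randint(0, 1)       # 0 = rectilinear, 1 = diagonal
    basis_e = random.randint(0, 1)
    # Eve's measurement: matching basis preserves the bit, else randomizes.
    bit_e = bit if basis_e == basis_a else random.randint(0, 1)
    basis_b = random.randint(0, 1)
    # Bob measures the state Eve resent (prepared in Eve's basis).
    bit_b = bit_e if basis_b == basis_e else random.randint(0, 1)
    if basis_b == basis_a:               # sifting: keep matching-basis rounds
        sifted += 1
        errors += (bit_b != bit)
qber = errors / sifted
```

Alice and Bob estimate this error rate on a sacrificed subset of the sifted key; anything near 25% (far above the channel's natural error floor) reveals the eavesdropper.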
Fields: Quantum Computing, Quantum Information Theory, Computer Science
For a concatenated code of level k with physical error rate p and threshold p_th, the logical error rate scales as p_L = p_th·(p/p_th)^{2^k}. Each level of concatenation doubles the exponent, so after...
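The doubly exponential suppression is easy to tabulate; the threshold and physical rate below are illustrative:

```python
# Logical error rate of a level-k concatenated code:
#   p_L = p_th * (p / p_th)^(2^k)
# Below threshold (p < p_th) each extra level squares the suppression factor.
p_th = 1e-2      # illustrative threshold
p = 1e-3         # physical error rate, one order of magnitude below threshold

def p_logical(k):
    return p_th * (p / p_th) ** (2 ** k)

rates = [p_logical(k) for k in range(5)]
# k=0: 1e-3, k=1: 1e-4, k=2: 1e-6, k=3: 1e-10, k=4: 1e-18
```

Five levels of concatenation turn a 0.1% physical error rate into an astronomically small logical one, at the cost of exponentially many physical qubits per logical qubit.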
Fields: Quantum Computing, Quantum Error Correction, Classical Coding Theory, Computer Science
Quantum error correction (Shor 1995, Steane 1996) maps directly onto classical coding theory: a [[n, k, d]] quantum code encodes k logical qubits into n physical qubits with code distance d (able to c...
Fields: Quantum Computing, Quantum Information, Computer Science, Spectral Graph Theory
Childs & Goldstone showed spatial search via continuous-time quantum walk locates a marked vertex on several graph families in O(√N) time by tuning a Hamiltonian built from the graph Laplacian plus a ...
Fields: Quantum Computing, Combinatorics, Statistical Physics
Simulated annealing (SA) solves combinatorial optimization by sampling from the Boltzmann distribution P(s) ∝ exp(-E(s)/T), decreasing T to concentrate probability on the minimum. Quantum annealing (Q...
Fields: Quantum Computing, Probability Theory, Algorithm Theory
The discrete-time quantum walk on a line replaces the classical coin flip (probability distribution P(x,t) satisfying the diffusion equation) with a unitary coin operator C acting on a qubit; the resu...
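The ballistic spreading can be verified directly by evolving the coined walk; the Hadamard coin and symmetric initial state below are the standard textbook choices:

```python
import numpy as np

# Discrete-time quantum walk on the line with a Hadamard coin. The standard
# deviation grows ballistically (sigma ~ 0.54*t for this walk) versus
# sigma = sqrt(t) for the classical random walk.
T = 100                                  # number of steps
size = 2 * T + 1                         # positions -T..T
H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)

# psi[x, c]: amplitude at position x with coin state c (0 = left, 1 = right).
psi = np.zeros((size, 2), dtype=complex)
psi[T, 0] = 1 / np.sqrt(2)               # symmetric initial coin state
psi[T, 1] = 1j / np.sqrt(2)

for _ in range(T):
    psi = psi @ H.T                      # apply the coin at every site
    shifted = np.zeros_like(psi)
    shifted[:-1, 0] = psi[1:, 0]         # coin 0 moves left
    shifted[1:, 1] = psi[:-1, 1]         # coin 1 moves right
    psi = shifted

prob = (np.abs(psi) ** 2).sum(axis=1)    # position distribution
x = np.arange(-T, T + 1)
sigma = np.sqrt((prob * x ** 2).sum() - (prob * x).sum() ** 2)
```

After 100 steps the classical walk would have sigma = 10; the interference-driven quantum walk spreads roughly five times further.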
Fields: Quantum Computing, Topology, Condensed Matter
Non-Abelian anyons (e.g., Fibonacci anyons, Majorana zero modes) in 2D topological phases have a braid group representation where exchanging anyons i and j applies a unitary gate U(σ_ij) on the degene...
Fields: Quantum Physics, Information Theory
Environment-induced superselection (einselection) identifies pointer states as eigenstates of the system observable that commutes with the system-environment interaction Hamiltonian H_int, explaining ...
Fields: Physics, Information Theory, Quantum Physics
The holographic entanglement entropy formula S_A = Area(γ_A)/(4 G_N ħ) (Ryu-Takayanagi) states that entanglement entropy of boundary region A in a CFT equals the area of the minimal bulk surf...
Fields: Quantum Physics, Condensed Matter Physics, Materials Science, Algebraic Topology, Quantum Computing
Topological insulators (TIs) are a phase of matter where the bulk band structure has a non-trivial topological invariant, even though the material is an insulator in the bulk. The topological invarian...
Fields: Radiology, Machine Learning, Pathology
Speculative analogy (to be empirically validated): residual blocks that stabilize very deep optimization can also stabilize representation transfer under histopathology stain variability when coupled ...
Fields: Seismology, Machine Learning, Geophysics
Speculative analogy (to be empirically validated): Physics-informed neural-operator constraints can regularize aftershock field forecasts analogously to stress-transfer priors in statistical seismolog...
Fields: Seismology, Signal Processing, Geophysics
The matched filter is the optimal linear filter for detecting a known signal s(t) in white Gaussian noise: h(t) = s(T-t) (time-reversed template). The output cross-correlation C(τ) = ∫s(t)·x(t+τ)dt ac...
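A minimal sketch of detection by cross-correlation against the known template; the windowed sinusoid, noise level, and onset are all made-up illustrative values:

```python
import numpy as np

rng = np.random.default_rng(7)

# Matched filtering: correlate noisy data against a known template.
# h(t) = s(T - t) maximizes output SNR among linear filters for a known
# signal in white Gaussian noise.
fs, dur = 100.0, 1.0
t = np.arange(0, dur, 1 / fs)
template = np.sin(2 * np.pi * 10 * t) * np.hanning(len(t))   # known waveform

n_total = 1000
x = rng.standard_normal(n_total)          # unit-variance white noise
true_onset = 400
x[true_onset:true_onset + len(template)] += 1.5 * template   # buried signal

# Cross-correlation C(tau) = sum_t s(t) x(t + tau); its peak marks arrival.
C = np.correlate(x, template, mode="valid")
tau_hat = int(np.argmax(C))
```

The signal is invisible in the raw trace (amplitude comparable to the noise), but the correlation peak pins down the arrival time to within about one template period.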
Fields: Signal Processing, Structural Biology, Mathematics
Speculative analogy: Phase-retrieval alternating-projection methods map onto cryo-EM orientation and reconstruction inference loops....
Fields: Social Science, Information Theory, Cultural Evolution, Sociology, Communication Theory
Shannon (1948) proved that any communication channel with noise can reliably transmit information at rates up to its channel capacity C = max_{p(x)} I(X;Y), and that error rates rise exponentially abo...
Fields: Social Science, Information Theory, Statistics, Computer Science, Privacy Law
Differential privacy (Dwork et al. 2006): a mechanism M satisfies ε-DP if for any adjacent datasets D, D' differing by one record: P[M(D)∈S] ≤ exp(ε) × P[M(D')∈S]. This is a formal guarant...
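A minimal sketch of the standard Laplace mechanism, which satisfies this definition for numeric queries; the count, ε, and number of trial releases are illustrative:

```python
import numpy as np

rng = np.random.default_rng(0)

# Laplace mechanism: adding Laplace(sensitivity/epsilon) noise to a numeric
# query with L1-sensitivity `sens` satisfies epsilon-differential privacy.
def laplace_mechanism(true_value, sens, epsilon):
    return true_value + rng.laplace(loc=0.0, scale=sens / epsilon)

# A counting query ("how many records satisfy P?") has sensitivity 1:
# adding or removing one record changes the answer by at most 1.
true_count = 128
epsilon = 0.5
releases = np.array(
    [laplace_mechanism(true_count, 1, epsilon) for _ in range(20000)]
)
```

Smaller ε means stronger privacy but wider noise: here the per-release standard deviation is √2/ε ≈ 2.83 counts, while the released values remain unbiased.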
Fields: Machine Learning, Social Science, Mathematics, Law And Policy, Statistics
Algorithmic fairness seeks criteria that trained classifiers should satisfy to avoid discrimination. Three prominent criteria conflict when base rates differ across groups: (1) demographic parity P(Ŷ=...
Fields: Mathematics, Social Science, Statistics, Computer Science, Epidemiology
A Bayesian network (BN) is a directed acyclic graph (DAG) in which nodes represent random variables and edges encode conditional dependencies. The joint distribution factorises as P(X₁,…,Xₙ) = ∏P(Xᵢ|p...
Fields: Social Science, Network Science, Sociology, Mathematics, Information Theory
Homophily — the tendency of similar individuals to form ties ("birds of a feather flock together") — is the dominant structural force shaping social networks. Measured by the assortativity coefficient...
Fields: Statistical Mechanics, Information Theory, Thermodynamics
Boltzmann's entropy S = k_B ln W (W = number of equally probable microstates) and Shannon's entropy H = −Σ p_i log p_i (probability distribution over messages) are the same mathematical object up to t...
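The equal-probability case makes the identification explicit: for pᵢ = 1/W, Shannon's sum collapses to log W, and k_B is the only difference from Boltzmann's S:

```python
import math

# For W equally probable microstates, H = -sum p log p with p_i = 1/W
# reduces to log W -- Boltzmann's S = k_B ln W up to the constant k_B
# (and a base change if entropy is measured in bits).
W = 1024
p = [1.0 / W] * W
H_nats = -sum(pi * math.log(pi) for pi in p)    # Shannon entropy in nats
H_bits = -sum(pi * math.log2(pi) for pi in p)   # same quantity in bits
k_B = 1.380649e-23
S = k_B * math.log(W)                           # Boltzmann entropy, J/K
```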
Fields: Statistical Physics, Information Theory, Thermodynamics
The Crooks fluctuation theorem P_F(W)/P_R(−W) = exp((W − ΔF)/kT) and the Jarzynski equality
Fields: Statistics, Bayesian Inference, Physics, Statistical Mechanics, Machine Learning
The partition function in statistical mechanics Z = Σ_x exp(-E(x)/kT) normalizes the Boltzmann distribution P(x) = exp(-E(x)/kT)/Z over all configurations x. In Bayesian inference, the posterior P(θ|d...
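A grid-based sketch of the correspondence, treating −log[P(data|θ)P(θ)] as the energy so the posterior takes the Boltzmann form exp(−E)/Z; the data, prior, and grid are illustrative:

```python
import numpy as np

# The Bayesian evidence P(data) = integral P(data|theta) P(theta) dtheta
# plays the role of the partition function Z: with "energy"
# E(theta) = -log[P(data|theta) P(theta)], the posterior is exp(-E)/Z.
theta = np.linspace(-10, 10, 4001)
dtheta = theta[1] - theta[0]
data = np.array([1.2, 0.8, 1.1])   # toy observations, unit-variance Gaussian

def neg_log_joint(th):
    log_lik = -0.5 * np.sum((data[:, None] - th[None, :]) ** 2, axis=0)
    log_prior = -0.5 * th ** 2     # N(0, 1) prior on the mean
    return -(log_lik + log_prior)  # additive constants cancel against Z

E = neg_log_joint(theta)
Z = np.sum(np.exp(-E)) * dtheta    # evidence = partition function
posterior = np.exp(-E) / Z         # Boltzmann form at kT = 1
post_mean = np.sum(theta * posterior) * dtheta
```

For this conjugate Gaussian setup the exact posterior mean is (Σ data)/(n + 1) = 0.775, which the grid "partition function" reproduces; annealing the effective temperature of E is exactly how thermodynamic-integration estimators of the evidence work.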
Fields: Statistics, Systems Biology, Computer Science
Speculative analogy: Variational latent-variable models can separate biological signal from technical noise in sparse single-cell count data....
Fields: Systems Biology, Machine Learning, Statistics
Speculative analogy (to be empirically validated): contrastive objectives that maximize agreement between paired views can align transcriptomic, epigenomic, and proteomic profiles into shared latent c...
Fields: Thermodynamics, Computer Science, Information Theory, Statistical Mechanics
Maxwell's demon (1867): a hypothetical being that monitors individual molecules in a partitioned gas container, opening a small door to let fast molecules pass to one side and slow ones to the other. ...
Fields: Virology, Information Theory, Evolutionary Biology
Eigen's quasispecies theory maps RNA virus evolution onto an information-theoretic error-correction problem: the master sequence is the optimal codeword, replication fidelity is the channel capacity, ...
Fields: Virology, Machine Learning, Evolutionary Biology
Speculative analogy (to be empirically validated): Protein language-model likelihoods can serve as soft constraints on viable mutational trajectories similarly to fitness-landscape priors used in vira...
Generated 2026-05-10 · USDR Dashboard