Primitive 9 of 9 — Final

SPREAD

The cognitive act of perceiving distribution—allocation across possibilities.

A single particle has a position. A mole of particles has a distribution.

The electron is not at one place in the orbital—it is spread across a probability cloud. The molecules in a gas are not at one energy—they are spread across a Boltzmann distribution. The measurement is not one value—it is spread across an uncertainty range.

SPREAD is the primitive of allocation. It answers: How is the whole distributed among the parts?

Where ACCUMULATION gathers parts into a whole, SPREAD describes how the whole is partitioned back into parts.

Part I

The Cognitive Basis of SPREAD

1.1 Perception of Distribution

Humans perceive spread directly.

Scatter: We see that objects are dispersed or clustered without counting. A scattered handful of seeds looks different from a concentrated pile.

Uncertainty: We sense that some outcomes are more likely than others. The experienced gambler, the weather-wise farmer, the seasoned clinician—all perceive probability before they calculate it.

Fairness: We perceive equal vs. unequal distribution. "That's not fair" is a child's protest against uneven spread.

1.2 Spread Requires Possibilities and Allocation

SPREAD involves:

A. A set of possibilities: The outcomes, states, positions, or values that could occur. The sample space.

B. A total quantity: What is being distributed. Probability (sums to 1). Mass. Energy. Population.

C. An allocation rule: How much goes to each possibility. The distribution function.

1.3 Discrete vs. Continuous Spread

Discrete: Finite or countable possibilities. Each has a probability.

$$P(X = x_i) = p_i, \quad \sum_i p_i = 1$$

Continuous: Uncountable possibilities (real numbers). Probability density.

$$P(a \leq X \leq b) = \int_a^b f(x) \, dx, \quad \int_{-\infty}^{\infty} f(x) \, dx = 1$$
[Interactive: Discrete vs Continuous Distributions — sliders for trial count n (default 20) and success probability p (default 0.50)]

As n increases, the discrete binomial distribution approaches the continuous Gaussian.
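This convergence can be checked numerically. A minimal sketch, using the widget's default n = 20, p = 0.5, compares the binomial pmf with a Gaussian of the same mean and variance:

```python
import math

def binomial_pmf(n, p, k):
    """P(X = k) for X ~ Binomial(n, p)."""
    return math.comb(n, k) * p**k * (1 - p)**(n - k)

def gaussian_pdf(x, mu, sigma):
    """Gaussian density with mean mu and standard deviation sigma."""
    return math.exp(-(x - mu)**2 / (2 * sigma**2)) / (sigma * math.sqrt(2 * math.pi))

n, p = 20, 0.5
mu, sigma = n * p, math.sqrt(n * p * (1 - p))  # matching mean and variance

# Largest pointwise gap between the discrete pmf and the continuous density
max_gap = max(abs(binomial_pmf(n, p, k) - gaussian_pdf(k, mu, sigma))
              for k in range(n + 1))
print(f"largest |pmf - pdf| for n={n}: {max_gap:.4f}")
```

Already at n = 20 the pointwise gap is a fraction of a percent; increasing n shrinks it further.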

Part II

Probability Foundations

2.1 Sample Space and Events

The sample space Ω is the set of all possible outcomes.

An event A is a subset of Ω.

2.2 Probability Axioms

Probability P satisfies:

  1. Non-negativity: P(A) ≥ 0 for all events A
  2. Normalization: P(Ω) = 1
  3. Additivity: For mutually exclusive A, B: P(A ∪ B) = P(A) + P(B)

2.3 Conditional Probability

The probability of A given that B has occurred (defined when P(B) > 0):

$$P(A|B) = \frac{P(A \cap B)}{P(B)}$$

2.4 Independence

Events A and B are independent if: $P(A \cap B) = P(A) \cdot P(B)$

2.5 Bayes' Theorem

$$P(A|B) = \frac{P(B|A) \cdot P(A)}{P(B)}$$

Relates conditional probabilities. Foundation for updating beliefs with evidence.
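A quick numerical illustration of Bayesian updating, using hypothetical screening-test numbers (1% prevalence, 95% sensitivity, 5% false-positive rate — illustrative values, not from the text):

```python
# Bayes' theorem: P(A|B) = P(B|A) * P(A) / P(B)
# Hypothetical screening test — all three numbers below are illustrative.
p_disease = 0.01              # P(A): prevalence
p_pos_given_disease = 0.95    # P(B|A): sensitivity
p_pos_given_healthy = 0.05    # false-positive rate

# Total probability: P(B) = P(B|A)P(A) + P(B|not A)P(not A)
p_pos = (p_pos_given_disease * p_disease
         + p_pos_given_healthy * (1 - p_disease))

# Posterior: probability of disease given a positive test
p_disease_given_pos = p_pos_given_disease * p_disease / p_pos
print(f"P(disease | positive) = {p_disease_given_pos:.3f}")
```

With these numbers the posterior is only about 16%: when the prior P(A) is small, even a fairly accurate test leaves most positives false.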

Part III

Random Variables and Distributions

3.1 Random Variables

A random variable X is a function from outcomes to numbers.

3.2 Probability Mass Function (Discrete)

For discrete X, the probability mass function: $p(x) = P(X = x)$

3.3 Probability Density Function (Continuous)

For continuous X, the probability density function:

$$P(a \leq X \leq b) = \int_a^b f(x) \, dx$$

3.4 Cumulative Distribution Function

The CDF: $F(x) = P(X \leq x)$

Part IV

Moments and Summary Statistics

4.1 Expected Value (Mean)

The expected value or mean:

$$E[X] = \mu = \sum_x x \cdot p(x) \quad \text{or} \quad \int_{-\infty}^{\infty} x \cdot f(x) \, dx$$

4.2 Variance and Standard Deviation

Variance:

$$\text{Var}(X) = \sigma^2 = E[(X - \mu)^2] = E[X^2] - (E[X])^2$$

Standard deviation: $\sigma = \sqrt{\text{Var}(X)}$

4.3 Properties

| Property | Expected Value | Variance |
|---|---|---|
| Scaling | E[aX + b] = aE[X] + b | Var(aX + b) = a²Var(X) |
| Sum | E[X + Y] = E[X] + E[Y] | Var(X + Y) = Var(X) + Var(Y)* |

*if X, Y independent
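These rules can be checked directly from the definitions on a small discrete distribution. A sketch using a fair die:

```python
# Verify E[aX + b] = aE[X] + b and Var(aX + b) = a^2 Var(X)
# on a hand-built discrete distribution: a fair six-sided die.
values = [1, 2, 3, 4, 5, 6]
probs = [1 / 6] * 6

def mean(vals, ps):
    """E[X] = sum of x * p(x)."""
    return sum(v * q for v, q in zip(vals, ps))

def var(vals, ps):
    """Var(X) = E[(X - mu)^2]."""
    m = mean(vals, ps)
    return sum(q * (v - m)**2 for v, q in zip(vals, ps))

a, b = 3.0, -2.0            # illustrative scaling constants
scaled = [a * v + b for v in values]

print(mean(values, probs), var(values, probs))   # 3.5 and 35/12
print(mean(scaled, probs), var(scaled, probs))
```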

Part V

Important Distributions

5.1 The Gaussian (Normal) Distribution

The Gaussian or normal distribution:

$$f(x) = \frac{1}{\sigma\sqrt{2\pi}} \exp\left(-\frac{(x-\mu)^2}{2\sigma^2}\right)$$
[Interactive: Gaussian Explorer — sliders for mean μ (default 0.0) and standard deviation σ (default 1.0), with shaded 68% (μ ± σ), 95% (μ ± 2σ), and 99.7% (μ ± 3σ) bands]

The Gaussian is characterized by its mean (center) and standard deviation (width).
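The 68/95/99.7 bands follow from P(|X − μ| ≤ kσ) = erf(k/√2), which the standard library can evaluate. A quick check:

```python
import math

def prob_within(k):
    """P(|X - mu| <= k*sigma) for any Gaussian, via the error function."""
    return math.erf(k / math.sqrt(2))

for k in (1, 2, 3):
    print(f"mu ± {k}σ: {prob_within(k):.4f}")
```

This prints the familiar 0.6827, 0.9545, and 0.9973, independent of μ and σ.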

5.2 Other Important Distributions

| Distribution | Type | Mean | Variance |
|---|---|---|---|
| Binomial(n, p) | Discrete | np | np(1−p) |
| Poisson(λ) | Discrete | λ | λ |
| Uniform[a, b] | Continuous | (a+b)/2 | (b−a)²/12 |
| Exponential(λ) | Continuous | 1/λ | 1/λ² |

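The table entries can be recovered numerically from the definitions of mean and variance. A sketch for the Poisson row, summing over a truncated pmf (λ = 4 is an illustrative choice):

```python
import math

lam = 4.0
# Truncate the Poisson pmf at k = 99; the tail beyond that is negligible for lam = 4.
pmf = [math.exp(-lam) * lam**k / math.factorial(k) for k in range(100)]

mean = sum(k * p for k, p in enumerate(pmf))
var = sum(p * (k - mean)**2 for k, p in enumerate(pmf))
print(f"Poisson({lam}): mean ≈ {mean:.6f}, variance ≈ {var:.6f}")
```

Both come out equal to λ, as the table states — the Poisson's signature property.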
Part VI

SPREAD in Physical Systems

6.1 The Boltzmann Distribution

At thermal equilibrium, the probability of finding a system in energy level i, with energy Eᵢ and degeneracy gᵢ:

$$P_i = \frac{g_i\, e^{-E_i/k_BT}}{Z}, \quad Z = \sum_i g_i\, e^{-E_i/k_BT}$$

where Z is the partition function.

[Interactive: Boltzmann Distribution — temperature slider; at T = 300 K, kT ≈ 2.49 kJ/mol]

Temperature controls how population spreads across energy levels. Low T → ground state; High T → even spread.
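A numerical sketch of Boltzmann populations for a hypothetical two-level system (nondegenerate levels, gᵢ = 1, at 0 and 5 kJ/mol — illustrative energies) at T = 300 K:

```python
import math

k_B = 1.380649e-23      # J/K
N_A = 6.02214076e23     # 1/mol

def boltzmann_populations(energies_kj_mol, T):
    """Fractional populations P_i = exp(-E_i/kT)/Z for nondegenerate levels,
    with level energies given in kJ/mol."""
    kT = k_B * T * N_A / 1000.0      # kJ/mol (about 2.49 at 300 K)
    weights = [math.exp(-E / kT) for E in energies_kj_mol]
    Z = sum(weights)                  # partition function
    return [w / Z for w in weights]

pops = boltzmann_populations([0.0, 5.0], 300.0)
print(pops)
```

At 300 K roughly 88% of the population sits in the ground state; raising T evens out the spread, as the widget shows.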

6.2 The Maxwell-Boltzmann Speed Distribution

For gas molecules of mass m at temperature T:

$$f(v) = 4\pi \left(\frac{m}{2\pi k_BT}\right)^{3/2} v^2 e^{-mv^2/2k_BT}$$
[Interactive: Maxwell-Boltzmann Speed Distribution — sliders for temperature T (default 300 K) and molar mass M (default 28 g/mol, N₂), marking the most probable, mean, and RMS speeds]

Higher T or lower M shifts the distribution to faster speeds and broadens it.
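The three characteristic speeds marked in the widget follow from f(v): v_mp = √(2RT/M), v̄ = √(8RT/πM), v_rms = √(3RT/M). A sketch for N₂ at 300 K:

```python
import math

R = 8.314  # gas constant, J/(mol K)

def characteristic_speeds(T, M_g_mol):
    """Most probable, mean, and rms speeds (m/s) of a Maxwell-Boltzmann gas,
    for molar mass given in g/mol."""
    M = M_g_mol / 1000.0                       # kg/mol
    v_mp = math.sqrt(2 * R * T / M)            # peak of f(v)
    v_mean = math.sqrt(8 * R * T / (math.pi * M))
    v_rms = math.sqrt(3 * R * T / M)
    return v_mp, v_mean, v_rms

# N2 at 300 K, matching the widget defaults
v_mp, v_mean, v_rms = characteristic_speeds(300.0, 28.0)
print(f"v_mp={v_mp:.0f}, v_mean={v_mean:.0f}, v_rms={v_rms:.0f} m/s")
```

The ordering v_mp < v̄ < v_rms holds for any T and M because the distribution is skewed toward high speeds.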

Part VII

SPREAD in Chemical Systems

7.1 Entropy and SPREAD

Entropy measures the spread of probability:

$$S = -k_B \sum_i P_i \ln P_i$$

This reduces to Boltzmann's $S = k_B \ln W$ when all W accessible microstates are equally probable ($P_i = 1/W$).

The second law: systems evolve toward maximum spread consistent with constraints.
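A sketch of the entropy sum, showing that a uniform distribution (maximum spread) carries more entropy than a peaked one (the probabilities below are illustrative; entropy is reported in units of k_B):

```python
import math

def entropy(ps):
    """S / k_B = -sum p ln p (terms with p = 0 contribute nothing)."""
    return -sum(p * math.log(p) for p in ps if p > 0)

uniform = [0.25] * 4                  # maximum spread over 4 states
peaked = [0.85, 0.05, 0.05, 0.05]     # concentrated in one state
print(entropy(uniform), entropy(peaked))
```

The uniform case gives exactly ln 4 — the largest value any 4-state distribution can attain.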

7.2 Orbital Probability Density

The electron in a hydrogen atom has probability density $\rho(\mathbf{r}) = |\psi(\mathbf{r})|^2$

The radial distribution function: $P(r) = 4\pi r^2 |\psi|^2$

[Interactive: Hydrogen Orbital Probability — orbital selector; for the 1s orbital the most probable radius is r = 1.0 a₀]

The radial distribution P(r) = 4πr²|ψ|² shows where the electron is most likely found.

Part VIII

The Central Limit Theorem

8.1 Statement

The Central Limit Theorem: the sum (or mean) of many independent, identically distributed random variables with finite variance approaches a Gaussian distribution, regardless of the shape of the original distribution.

8.2 Implications

This explains why the Gaussian appears everywhere: measurement errors, molecular speeds, and diffusion displacements all arise as sums of many small, independent contributions.

[Interactive: Central Limit Theorem — sample-size slider and a running histogram of sample means]

Sample from a uniform distribution. As n increases, the distribution of means becomes Gaussian.
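The experiment in the caption can be reproduced in a few lines. A sketch drawing sample means from Uniform(0, 1) with n = 30 (an illustrative sample size):

```python
import random
import statistics

random.seed(0)  # reproducible illustration

def sample_mean(n):
    """Mean of n draws from Uniform(0, 1)."""
    return sum(random.random() for _ in range(n)) / n

# CLT prediction for the distribution of sample means:
# mean -> 1/2, variance -> (1/12)/n
n = 30
means = [sample_mean(n) for _ in range(20000)]
print(statistics.mean(means), statistics.variance(means))
```

The empirical mean lands near 0.5 and the variance near 1/360 ≈ 0.00278, matching the CLT prediction even though each draw is uniform, not Gaussian.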

Part IX

SPREAD and Other Primitives

9.1 SPREAD and ACCUMULATION

ACCUMULATION and SPREAD are duals.

ACCUMULATION: ∫f(x)dx (gather density into total)
SPREAD: f(x) = dF/dx (distribute total into density)

The PDF is the derivative of the CDF. The CDF is the integral of the PDF.
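This duality can be checked numerically. A sketch for Exponential(λ = 2), comparing a central-difference derivative of the CDF F with the PDF f (λ and the test point x are illustrative):

```python
import math

lam = 2.0
F = lambda x: 1 - math.exp(-lam * x)    # CDF of Exponential(lam)
f = lambda x: lam * math.exp(-lam * x)  # PDF

# The PDF is the derivative of the CDF: compare against a central difference.
x, h = 0.7, 1e-6
numeric = (F(x + h) - F(x - h)) / (2 * h)
print(numeric, f(x))
```

The two values agree to many digits; in the other direction, integrating f from 0 to x recovers F(x).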

9.2 SPREAD and SAMENESS

Maximum SPREAD is a form of SAMENESS: uniform distribution treats all outcomes identically.

Equilibrium maximizes entropy (SPREAD) subject to constraints (SAMENESS of total energy).

9.3 SPREAD and RATE

Distributions evolve in time. The master equation describes RATE of change of SPREAD:

$$\frac{dP_i}{dt} = \sum_j (k_{ji}P_j - k_{ij}P_i)$$

At equilibrium, dPᵢ/dt = 0. The distribution is stationary.
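A sketch of the master equation for a hypothetical two-state system (the rate constants k₁₂, k₂₁ are illustrative), integrated with forward Euler until the distribution becomes stationary:

```python
# Two-state master equation:
#   dP1/dt = k21*P2 - k12*P1,   dP2/dt = -dP1/dt
k12, k21 = 2.0, 1.0   # illustrative rate constants for 1 -> 2 and 2 -> 1
P1, P2 = 1.0, 0.0     # start with all probability in state 1
dt = 0.001
for _ in range(20000):
    flow = k21 * P2 - k12 * P1
    P1 += flow * dt
    P2 -= flow * dt

# The stationary distribution satisfies k12*P1 = k21*P2, i.e. P1 = k21/(k12 + k21)
print(f"P1={P1:.4f}, P2={P2:.4f}")
```

The spread relaxes to P₁ = 1/3, P₂ = 2/3: the state that is harder to leave accumulates more probability.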

Part X

Summary

10.1 What SPREAD Is

SPREAD is distribution—how a total is allocated across possibilities.

The perception is primary. We see scatter vs. clustering. We feel uncertainty. We recognize fairness and unfairness in allocation.

The formalizations are secondary: probability, density functions, distributions, entropy. These tools quantify spread and enable prediction of aggregate behavior from individual randomness.

Key Concepts

Probability: P(A) ∈ [0,1]. Measure of likelihood.

PDF/PMF: How probability is spread across values.

Mean: Center of mass of the distribution.

Variance: Measure of spread width.

Boltzmann distribution: How population spreads across energy levels.

Entropy: Measure of spread; maximized at equilibrium.

All Nine Primitives Complete

The cognitive foundation of Chemical Thinking is now established.

10.2 The Nine Primitives

| Primitive | Perception | Tools |
|---|---|---|
| COLLECTION | "There are many" | Sets, counting |
| ARRANGEMENT | "Order matters" | Permutations, matrices |
| DIRECTION | "It points" | Vectors, dot product |
| PROXIMITY | "Near vs far" | Functions, limits |
| SAMENESS | "Unchanged" | Symmetry, eigenvalues |
| CHANGE | "Becoming" | Derivatives |
| RATE | "How fast" | Differential equations, kinetics |
| ACCUMULATION | "All together" | Integrals |
| SPREAD | "Distributed" | Probability, distributions |

The primitives form a complete basis for perceiving and formalizing chemical phenomena.

Endnotes

[1] The perception of probability and risk is discussed in Kahneman, D. (2011). Thinking, Fast and Slow. Farrar, Straus and Giroux.
[2] The axiomatic foundation of probability is due to Kolmogorov, A.N. (1933). Foundations of the Theory of Probability.
[3] The Boltzmann distribution and statistical mechanics are covered in McQuarrie, D.A. (2000). Statistical Mechanics. University Science Books.
[4] The central limit theorem is proven in Feller, W. (1968). An Introduction to Probability Theory and Its Applications (3rd ed.). Wiley.