Academic Research Project • Jain University
A statistical mechanics approach to simulating the early universe using NASA FIRAS data, machine learning, and the three Gibbs ensembles (microcanonical, canonical, and grand canonical).
"The law that entropy always increases holds, I think, the supreme position among the laws of Nature."
— Arthur Eddington
The behaviour of particles in the early universe cannot be understood by deterministic tracking. Instead, we must examine the underlying statistical structure. The universe is inherently probabilistic.
Equilibrium does not arise from strict Newtonian motion, but from probability. The system naturally evolves toward the state that can be realized in the greatest number of ways.
A microstate represents a complete specification of all particle properties, while a macrostate is defined by observable quantities like energy and temperature.
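As a toy illustration (not part of the simulation itself), consider N two-state particles: a macrostate fixes only how many sit in the upper state, while each distinct labelling of the particles is a separate microstate. The count W can be computed directly:

```python
from math import comb

# Toy system: N two-state particles (e.g. spin up / spin down).
# A macrostate fixes only k, the number of particles in the upper state;
# every distinct assignment of labels consistent with k is a microstate.
N = 4
for k in range(N + 1):
    W = comb(N, k)  # number of microstates realizing macrostate k
    print(f"k = {k}: W = {W}")
```

The middle macrostate (k = 2) is realized in the most ways (W = 6), which is exactly why equilibrium corresponds to the most probable macrostate.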
A powerful framework where the universe is viewed as part of a vast collection of possible configurations, exploring a "space" of all accessible statistical states.
This simulation is directly grounded in physical laws. The early universe existed in a state of thermal equilibrium, where the distribution of radiation energy was governed strictly by Planck's law:
\[ B_\nu(T) = \frac{2h\nu^{3}}{c^{2}} \, \frac{1}{e^{h\nu/kT} - 1} \]
This equation tells us that energy is not random; it follows a precise statistical distribution. To bridge theoretical physics and computational modelling, we transform the intensity values into a normalized probability distribution:
\[ p_i = \frac{I_i}{\sum_j I_j} \]
This turns physics into probability: the normalized spectrum becomes a distribution from which the simulation engine samples particle energy states. Everything culminates in maximizing thermodynamic entropy, \(S = k \ln W\).
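A minimal sketch of this normalization step (the intensity values here are hypothetical placeholders; the real pipeline reads them from the FIRAS file):

```python
import numpy as np

# Hypothetical intensities standing in for the FIRAS spectrum column
intensity = np.array([0.2, 0.9, 1.0, 0.6, 0.1])

# Normalize into a discrete probability distribution: p_i = I_i / sum_j I_j
p = intensity / intensity.sum()
assert np.isclose(p.sum(), 1.0)

# Sample energy-state indices with probability proportional to intensity
states = np.random.choice(len(p), size=1000, p=p)
```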
Figure: frequency (ν) vs. intensity (I). The curve confirms the early universe was in thermal equilibrium, following a blackbody distribution.
Click on any thermodynamic framework below to dive deeper into its statistical mechanics, mathematics, and code implementation.
energies = np.ones(N) * 0.5
energies = kde.sample(N)
if np.random.rand() < 0.05: add()
if np.random.rand() < 0.03: rem()
A deep dive into how the Python architecture translates mathematical theory into a live, three-dimensional physical engine.
urllib.request.urlretrieve(...)
This is NOT random data. It is NASA FIRAS CMB data, downloaded live, representing measured early-universe radiation.
kde = KernelDensity(...)
The Machine Learning KDE model reconstructs the underlying probability density of energy states derived from observational data. It learns the hidden statistical structure of the universe.
samples = kde.sample(N)
Particle energy states are drawn according to the learned distribution: probability-weighted sampling from the reconstructed probability density.
vel += H0 * pos
Models volume exchange through continuous spatial expansion, reflecting the Hubble expansion constant (H0).
# ==========================================================
# ULTIMATE COSMIC ENGINE (NASA + ML + 3 ENSEMBLES + EXPLAIN)
# ==========================================================
import numpy as np
import urllib.request
import plotly.graph_objects as go
from plotly.subplots import make_subplots
from sklearn.neighbors import KernelDensity
print("Summoning the Universe...")
# ==========================================================
# SETTINGS
# ==========================================================
MODE = "canonical"
# OPTIONS: "microcanonical", "canonical", "grand"
N = 400
G = 0.0005
H0 = 0.012
dt = 0.05
steps = 100
# ==========================================================
# LOAD NASA FIRAS DATA
# ==========================================================
url = "https://lambda.gsfc.nasa.gov/data/cobe/firas/monopole_spec/firas_monopole_spec_v1.txt"
urllib.request.urlretrieve(url, "firas.txt")
data = np.loadtxt("firas.txt", comments="#")
freq = data[:, 0]
intensity = data[:, 1]
intensity = intensity / np.max(intensity)
# ==========================================================
# ML MODEL (KDE)
# ==========================================================
kde = KernelDensity(bandwidth=0.05)
kde.fit(freq.reshape(-1,1), sample_weight=intensity)
# ==========================================================
# ENERGY INITIALIZATION (3 ENSEMBLES)
# ==========================================================
if MODE == "microcanonical":
    # Isolated system: every particle starts at the same fixed energy
    energies = np.ones(N) * 0.5
elif MODE in ("canonical", "grand"):
    # Draw energies from the distribution learned from the FIRAS data
    samples = kde.sample(N)
    energies = samples.flatten()
    energies = (energies - energies.min()) / (energies.max() - energies.min())
# ==========================================================
# INITIAL BIG BANG
# ==========================================================
pos = np.random.normal(0, 0.05, (N,3))
vel = np.random.normal(0, 1.0, (N,3))
vel += pos * 3
mass = energies + 0.1
# ==========================================================
# FAST GRAVITY
# ==========================================================
def gravity_fast(pos, mass):
    # Pairwise displacements: dx[i, j] = pos[i] - pos[j]
    dx = pos[:, np.newaxis, :] - pos[np.newaxis, :, :]
    dist = np.linalg.norm(dx, axis=2) + 1e-5      # softening avoids divide-by-zero
    force = -G * dx / dist[:, :, None]**3         # minus sign: gravity attracts
    acc = np.sum(force * mass[np.newaxis, :, None], axis=1)
    return acc
# ==========================================================
# BACKGROUND STARS
# ==========================================================
stars = 800
sx = np.random.uniform(-20,20,stars)
sy = np.random.uniform(-20,20,stars)
sz = np.random.uniform(-20,20,stars)
# ==========================================================
# CREATE FIGURE WITH GRAPH PANEL
# ==========================================================
fig = make_subplots(
    rows=1, cols=2,
    specs=[[{'type': 'scene'}, {'type': 'xy'}]],
    column_widths=[0.7, 0.3],
    subplot_titles=("Universe", "Energy Distribution")
)
# ==========================================================
# SIMULATION LOOP
# ==========================================================
frames = []
for t in range(steps):
    # Hubble expansion: velocities pick up a term proportional to position
    vel += H0 * pos
    # Gravity
    vel += gravity_fast(pos, mass) * dt
    # Update positions
    pos += vel * dt
    # GRAND CANONICAL EFFECTS: stochastic particle creation / destruction
    if MODE == "grand":
        if np.random.rand() < 0.05:
            new_pos = np.random.normal(0, 0.1, (5, 3))
            new_vel = np.random.normal(0, 1, (5, 3))
            new_energy = kde.sample(5).flatten()
            pos = np.vstack([pos, new_pos])
            vel = np.vstack([vel, new_vel])
            energies = np.concatenate([energies, new_energy])
            mass = energies + 0.1
        if len(pos) > 200 and np.random.rand() < 0.03:
            pos = pos[:-5]
            vel = vel[:-5]
            energies = energies[:-5]
            mass = energies + 0.1
    # Histogram of current energies (right-hand panel)
    hist, bins = np.histogram(energies, bins=30)
    frames.append(go.Frame(
        data=[
            # background stars
            go.Scatter3d(
                x=sx, y=sy, z=sz,
                mode='markers',
                marker=dict(size=1, color='white', opacity=0.1)
            ),
            # particles (copies: pos is mutated in place on later steps,
            # so each frame must snapshot its own coordinates)
            go.Scatter3d(
                x=pos[:, 0].copy(), y=pos[:, 1].copy(), z=pos[:, 2].copy(),
                mode='markers',
                marker=dict(
                    size=4,
                    color=energies.copy(),
                    colorscale='Turbo',
                    opacity=0.95
                )
            ),
            # energy histogram
            go.Bar(
                x=bins[:-1],
                y=hist,
            )
        ]
    ))
# ==========================================================
# INITIAL FRAME
# ==========================================================
fig.add_trace(go.Scatter3d(
    x=sx, y=sy, z=sz,
    mode='markers',
    marker=dict(size=1, color='white', opacity=0.1)
), row=1, col=1)
fig.add_trace(go.Scatter3d(
    x=pos[:, 0], y=pos[:, 1], z=pos[:, 2],
    mode='markers',
    marker=dict(size=4, color=energies, colorscale='Turbo')
), row=1, col=1)
hist, bins = np.histogram(energies, bins=30)
fig.add_trace(go.Bar(x=bins[:-1], y=hist), row=1, col=2)
# ==========================================================
# LAYOUT (SCI-FI + EXPLANATION)
# ==========================================================
fig.update_layout(
    title=f"""
COSMIC ENGINE — {MODE.upper()} ENSEMBLE
🔵 Blue = Low Energy (Cold States)
🔴 Red = High Energy (Hot States)
Microcanonical → Fixed Energy Universe
Canonical → Thermal (NASA Data Driven)
Grand Canonical → Particle Creation & Destruction
""",
    paper_bgcolor="black",
    font=dict(color="cyan"),
    scene=dict(
        bgcolor="black",
        xaxis=dict(visible=False),
        yaxis=dict(visible=False),
        zaxis=dict(visible=False)
    ),
    updatemenus=[{
        "type": "buttons",
        "buttons": [
            {
                "label": "▶ Run Universe",
                "method": "animate",
                "args": [None, {
                    "frame": {"duration": 40},
                    "transition": {"duration": 0},
                    "fromcurrent": True,
                    "mode": "immediate"
                }]
            }
        ]
    }]
)
fig.frames = frames
# ==========================================================
# EXPORT HTML
# ==========================================================
fig.write_html("ultimate_cosmic_engine.html")
print("SUCCESS: Open ultimate_cosmic_engine.html")
The simulation renders in an interactive WebGL Plotly interface.
Color map: blue = low energy; red = high energy.
Structure: dense center = high-probability states; outer region = low-probability expansion.
"The simulation demonstrates that particle distributions in the early universe are governed by probabilistic laws derived from real observational data. By mapping radiation intensity into a probability distribution and implementing Gibbs ensemble frameworks, the system reproduces key thermodynamic behaviors such as equilibrium, particle exchange, and volume expansion. The inclusion of microcanonical, canonical, and grand canonical ensembles provides a comprehensive representation of different physical regimes, with the grand canonical model closely reflecting the dynamic nature of the early universe."
Embedded Academic Reader. The complete paper containing mathematical derivations and theoretical explanations is loaded directly below.
Yash Yadav
The early universe can be understood as a high-temperature statistical system governed by probabilistic laws rather than deterministic trajectories. In this work, we construct a data-driven computational framework that connects observational cosmology with statistical mechanics. Using Cosmic Microwave Background (CMB) spectral data from the COBE/FIRAS experiment, we derive a probability distribution of energy states and employ it to generate particle configurations within a Gibbs ensemble framework. The model incorporates microcanonical, canonical, and grand canonical ensembles to explore different thermodynamic regimes. A kernel density estimation (KDE) approach is used to approximate the underlying distribution of states. The resulting simulation demonstrates that equilibrium structure, particle clustering, and energy distributions emerge naturally from probabilistic sampling and expansion dynamics. This study highlights how macroscopic cosmological behavior can arise from statistical principles and provides a computational bridge between observational data and theoretical physics.
The evolution of the early universe is fundamentally governed by statistical laws arising from large ensembles of interacting particles. Rather than tracking individual particle trajectories, statistical mechanics provides a framework in which macroscopic behavior emerges from distributions over microscopic states. Observational evidence from the Cosmic Microwave Background (CMB) confirms that the early universe existed in a near-equilibrium thermal state characterized by a blackbody spectrum.
This work adopts a probabilistic perspective, where particle distributions are reconstructed from observational data and interpreted through the lens of Gibbs ensemble theory. By combining data-driven modeling with computational simulation, we aim to demonstrate how large-scale structure and equilibrium properties emerge naturally from statistical principles.
The spectral energy density of radiation in thermal equilibrium is described by Planck's law:
\[ B_\nu(T) = \frac{2h\nu^{3}}{c^{2}} \, \frac{1}{e^{h\nu/kT} - 1} \]
This distribution arises from Bose-Einstein statistics and governs photon occupation numbers in equilibrium.
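As a quick numerical check (a sketch, not part of the simulation pipeline), Planck's law can be evaluated at the measured CMB temperature T = 2.725 K; by Wien's displacement law the spectrum should peak near 160 GHz:

```python
import numpy as np

h = 6.62607015e-34   # Planck constant (J s)
k = 1.380649e-23     # Boltzmann constant (J/K)
c = 2.99792458e8     # speed of light (m/s)
T = 2.725            # measured CMB temperature (K)

def planck(nu):
    """Spectral radiance B_nu(T) of a blackbody (W sr^-1 m^-2 Hz^-1)."""
    return (2 * h * nu**3 / c**2) / np.expm1(h * nu / (k * T))

nu = np.linspace(1e9, 1.2e12, 5000)          # 1 GHz .. 1.2 THz
nu_peak = nu[np.argmax(planck(nu))]
print(f"Spectrum peaks near {nu_peak / 1e9:.0f} GHz")
```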
To translate observational data into a simulation framework, intensity values are normalized into a probability distribution:
This allows the radiation spectrum to be interpreted as a statistical distribution over energy states.
The macroscopic state of the system is determined by entropy:
\[ S = k \ln W \]
where \(W\) represents the number of accessible microstates. Systems evolve toward configurations that maximize entropy, corresponding to equilibrium.
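The entropy principle can be made concrete with a small multiplicity calculation (an illustrative sketch; the occupation numbers are arbitrary): spreading particles more evenly across levels yields more microstates and hence higher \(S = k \ln W\).

```python
from math import factorial, log

def multiplicity(occupation):
    """W = N! / (n_1! n_2! ...): microstates for a given occupation."""
    W = factorial(sum(occupation))
    for n in occupation:
        W //= factorial(n)
    return W

# 12 particles over 3 energy levels: more even occupations have larger W,
# so entropy S/k = ln W is maximized near the uniform configuration.
for occ in [(12, 0, 0), (8, 3, 1), (4, 4, 4)]:
    W = multiplicity(occ)
    print(occ, "W =", W, " S/k =", round(log(W), 2))
```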
The system is analyzed under three ensemble conditions:
The total energy is conserved:
\[ E = \sum_i E_i = \text{const} \]
This represents an isolated system with no exchange of energy or particles.
The system exchanges energy with a thermal reservoir. The probability of a state with energy \(E_{i}\) is:
\[ P(E_i) = \frac{e^{-\beta E_i}}{Z}, \]
where \(\beta=1/kT\) and \(Z\) is the partition function.
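A minimal sketch of canonical sampling (the energy levels and temperature are arbitrary illustrative values):

```python
import numpy as np

E = np.array([0.0, 1.0, 2.0])   # discrete energy levels (arbitrary units)
beta = 1.0                      # inverse temperature 1/kT

w = np.exp(-beta * E)           # Boltzmann weights
Z = w.sum()                     # partition function
p = w / Z                       # P(E_i) = exp(-beta * E_i) / Z

# Lower-energy states are exponentially more probable
assert p[0] > p[1] > p[2]
samples = np.random.choice(E, size=10_000, p=p)
```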
Both energy and particle number fluctuate. The probability distribution becomes:
\[ P(E_i, N_i) = \frac{e^{-\beta (E_i - \mu N_i)}}{\Xi}, \]
where \(\mu\) is the chemical potential and \(\Xi\) is the grand partition function.
This ensemble is particularly relevant to the early universe, where particle creation and annihilation processes occur.
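A sketch of the grand canonical weighting over a handful of hypothetical states (the \((N_i, E_i)\) pairs and \(\mu\) are illustrative assumptions, not values from the simulation):

```python
import numpy as np

beta, mu = 1.0, 0.5                                  # inverse temperature, chemical potential
states = [(0, 0.0), (1, 1.0), (1, 1.5), (2, 2.2)]    # hypothetical (N_i, E_i) pairs

w = np.array([np.exp(-beta * (E - mu * n)) for n, E in states])
Xi = w.sum()          # grand partition function
p = w / Xi            # P(E_i, N_i) = exp(-beta (E_i - mu N_i)) / Xi

# Mean particle number <N> now depends on mu: raising mu favors creation
mean_N = sum(pi * n for pi, (n, E) in zip(p, states))
```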
The dataset used in this study is obtained from the COBE/FIRAS instrument, which provides high-precision measurements of the CMB spectrum. This dataset represents one of the strongest confirmations of thermal equilibrium in the early universe.
To approximate the underlying probability distribution, we employ Kernel Density Estimation:
\[ \hat{f}(x) = \frac{1}{nh} \sum_{i=1}^{n} K\!\left(\frac{x - x_i}{h}\right), \]
where \(K\) is a Gaussian kernel and \(h\) the bandwidth.
This enables smooth sampling of energy states consistent with observational data.
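A minimal sketch of the KDE step with scikit-learn (the observations here are synthetic stand-ins for the FIRAS-derived energies):

```python
import numpy as np
from sklearn.neighbors import KernelDensity

rng = np.random.default_rng(0)
# Synthetic stand-in for FIRAS-derived energy observations
obs = rng.normal(loc=1.0, scale=0.2, size=500).reshape(-1, 1)

kde = KernelDensity(kernel="gaussian", bandwidth=0.05).fit(obs)
new_states = kde.sample(1000, random_state=0)   # smooth resampling of states

# Fresh draws track the statistics of the observations
assert abs(new_states.mean() - obs.mean()) < 0.05
```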
Particles are initialized in three-dimensional space with stochastic positions and velocities. The system evolves under a per-step update combining Hubble expansion with Newtonian gravity:
\[ \vec{v} \to \vec{v} + H_0\,\vec{r} + \vec{a}_{\text{grav}}\,\Delta t, \qquad \vec{r} \to \vec{r} + \vec{v}\,\Delta t \]
The grand canonical regime further includes stochastic particle creation and annihilation, modeling early-universe dynamics.
The simulation produces several key behaviors: particle clustering under gravity, continued spatial expansion, and energy histograms that remain consistent with the underlying probability distribution, indicating preservation of statistical equilibrium.
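This consistency check can be sketched independently of the full engine (synthetic data under assumed parameters): histogram the source sample and fresh KDE draws on shared bins and compare.

```python
import numpy as np
from sklearn.neighbors import KernelDensity

rng = np.random.default_rng(1)
data = rng.normal(0.5, 0.1, size=2000).reshape(-1, 1)   # stand-in energy sample

kde = KernelDensity(bandwidth=0.05).fit(data)
draws = kde.sample(2000, random_state=1)

# Density histograms on shared bins: sampling preserves the distribution
h_data, edges = np.histogram(data, bins=20, range=(0, 1), density=True)
h_draw, _ = np.histogram(draws, bins=20, range=(0, 1), density=True)
assert np.abs(h_data - h_draw).mean() < 0.5
```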
The results demonstrate that large-scale structure can emerge from purely statistical rules without requiring explicit deterministic modeling of every interaction. The grand canonical ensemble provides the most physically relevant description, capturing the dynamic exchange processes characteristic of the early universe.
This reinforces the idea that cosmological evolution is fundamentally governed by probabilistic principles.
The present model includes several simplifying assumptions: non-relativistic Newtonian gravity, dimensionless normalized units, a small particle count, and a linearized Hubble expansion term.
Despite these limitations, the model captures essential statistical features of early-universe behavior.
This work demonstrates that particle distributions in the early universe can be effectively reconstructed using statistical mechanics and data-driven modeling. By integrating observational data, Gibbs ensemble theory, and computational simulation, the study provides a framework for understanding how equilibrium and structure emerge from probabilistic laws.