Computational Methods
A Gentle Introduction to Particle Filters for Cognitive Modelling
January 15, 2026 · 2 min read
Particle filters, also known as Sequential Monte Carlo methods, provide a powerful framework for estimating the hidden states of dynamic systems. In cognitive science, they allow us to track how latent psychological variables evolve over time.
The State-Space Framework
Consider a system where the latent state x_t evolves according to a transition model p(x_t | x_{t−1}), and we observe a noisy measurement y_t at each time step. The goal is to estimate the posterior distribution of the current state given all observations so far:

p(x_t | y_{1:t}) ∝ p(y_t | x_t) ∫ p(x_t | x_{t−1}) p(x_{t−1} | y_{1:t−1}) dx_{t−1}
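To make the setup concrete, here is a minimal simulation of one such system: a Gaussian random walk observed with additive noise. The specific noise scales (0.1 for the transition, 1 for the observations) are illustrative choices, not prescribed by the framework.

```python
import numpy as np

rng = np.random.default_rng(0)

T = 100
x = np.zeros(T)  # latent states (hidden from the observer)
y = np.zeros(T)  # noisy observations (what we actually measure)

x[0] = rng.normal(0, 1)         # initial state drawn from the prior
y[0] = x[0] + rng.normal(0, 1)  # observed with unit-variance noise
for t in range(1, T):
    x[t] = x[t - 1] + rng.normal(0, 0.1)  # transition model: random walk
    y[t] = x[t] + rng.normal(0, 1)        # observation model: Gaussian noise
```

Only y is available to the filter; the task is to recover the trajectory of x from it.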
Implementation in Python
```python
import numpy as np

def particle_filter(y, n_particles=1000):
    """Simple bootstrap particle filter."""
    particles = np.random.normal(0, 1, n_particles)
    weights = np.ones(n_particles) / n_particles
    trajectories = []
    for obs in y:
        # Propagate particles through the random-walk transition model
        particles = particles + np.random.normal(0, 0.1, n_particles)
        # Weight by the Gaussian likelihood of the new observation
        log_w = -0.5 * (obs - particles) ** 2
        weights = np.exp(log_w - np.max(log_w))
        weights /= weights.sum()
        # Resample in proportion to the weights
        indices = np.random.choice(n_particles, size=n_particles, p=weights)
        particles = particles[indices]
        trajectories.append(particles.mean())
    return np.array(trajectories)
```

The algorithm proceeds in three steps: propagate particles forward through the transition model, weight them by their likelihood given the new observation, and resample to focus computational resources on high-probability regions.
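Resampling at every step, as the sketch above does, throws away weight information even when the weights are still well spread. A common refinement, not implemented above, is to resample only when the effective sample size (ESS) of the weights drops below a threshold. A minimal sketch:

```python
import numpy as np

def effective_sample_size(weights):
    """ESS = 1 / sum(w_i^2) for normalised weights.

    Ranges from n (perfectly uniform weights) down to 1
    (all mass concentrated on a single particle).
    """
    return 1.0 / np.sum(weights ** 2)

n = 1000
uniform = np.ones(n) / n          # healthy, evenly weighted particle set
degenerate = np.zeros(n)
degenerate[0] = 1.0               # all weight on one particle

print(effective_sample_size(uniform))     # ≈ 1000: no degeneracy
print(effective_sample_size(degenerate))  # 1.0: complete degeneracy

# A common rule of thumb: resample only when ESS < n / 2
```

Inside the filter loop, this amounts to wrapping the resampling step in a check such as `if effective_sample_size(weights) < n_particles / 2`, carrying the weights forward unchanged otherwise.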