Computational Methods

A Gentle Introduction to Particle Filters for Cognitive Modelling

January 15, 2026 · 2 min read

Particle filters, also known as Sequential Monte Carlo methods, provide a powerful framework for estimating the hidden states of dynamic systems. In cognitive science, they allow us to track how latent psychological variables evolve over time.

The State-Space Framework

Consider a system where the latent state $x_t$ evolves according to a transition model, and we observe noisy measurements $y_t$ at each time step. The goal is to estimate the posterior distribution:

$$p(x_t \mid y_{1:t}) \propto p(y_t \mid x_t) \int p(x_t \mid x_{t-1}) \, p(x_{t-1} \mid y_{1:t-1}) \, dx_{t-1}$$
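
This integral is rarely tractable in closed form. The particle filter sidesteps it by approximating the previous posterior with a weighted set of $N$ samples (particles), which turns the recursion into a finite sum:

$$p(x_{t-1} \mid y_{1:t-1}) \approx \sum_{i=1}^{N} w_{t-1}^{(i)} \, \delta\big(x_{t-1} - x_{t-1}^{(i)}\big), \qquad p(x_t \mid y_{1:t}) \propto p(y_t \mid x_t) \sum_{i=1}^{N} w_{t-1}^{(i)} \, p\big(x_t \mid x_{t-1}^{(i)}\big)$$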

Under Cumulative Prospect Theory, the value function is $v(x) = (x - r)^\alpha$ for $x \ge r$ and $v(x) = -\lambda (r - x)^\beta$ for $x < r$, where $r$ is the reference point and $\lambda > 1$ captures loss aversion. Parameters like $\alpha$ and $\lambda$ may drift across trials of an experiment, which makes them natural latent states for a particle filter to track.
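
For concreteness, here is a minimal sketch of that value function in Python. The function name is illustrative, and the default parameter values follow Tversky and Kahneman's (1992) median estimates:

import numpy as np

def cpt_value(x, r=0.0, alpha=0.88, beta=0.88, lam=2.25):
    """Piecewise CPT value function: concave for gains, convex and
    steeper for losses (loss aversion) relative to the reference point r."""
    d = np.asarray(x, dtype=float) - r
    return np.where(d >= 0, np.abs(d) ** alpha, -lam * np.abs(d) ** beta)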

Implementation in Python

import numpy as np

def particle_filter(y, n_particles=1000):
    """Bootstrap particle filter for a Gaussian random-walk model.

    Assumes transition noise with sd 0.1 and unit observation noise.
    """
    # Initialise particles from a standard-normal prior over the state
    particles = np.random.normal(0, 1, n_particles)

    trajectories = []
    for obs in y:
        # Propagate: push each particle through the random-walk transition
        particles = particles + np.random.normal(0, 0.1, n_particles)

        # Weight by the Gaussian likelihood; subtracting the max
        # log-weight keeps the exponentiation numerically stable
        log_w = -0.5 * (obs - particles) ** 2
        weights = np.exp(log_w - np.max(log_w))
        weights /= weights.sum()

        # Record the weighted posterior mean before resampling,
        # which is less noisy than averaging the resampled particles
        trajectories.append(np.sum(weights * particles))

        # Resample (multinomial) to focus particles on
        # high-probability regions of the state space
        indices = np.random.choice(n_particles, size=n_particles, p=weights)
        particles = particles[indices]

    return np.array(trajectories)
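
As a quick sanity check, the filter can be run on data simulated from the same model; the sample size, noise levels, and seed here are illustrative:

# Simulate a latent random walk observed with unit-variance noise
np.random.seed(0)
true_x = np.cumsum(np.random.normal(0, 0.1, 200))
y = true_x + np.random.normal(0, 1, 200)

estimates = particle_filter(y)
print(np.corrcoef(true_x, estimates)[0, 1])  # informal check: truth vs. estimates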

The algorithm proceeds in three steps: propagate particles forward through the transition model, weight them by their likelihood given the new observation, and resample to focus computational resources on high-probability regions.
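
The resampling step above uses multinomial resampling via np.random.choice, which is simple but injects extra Monte Carlo noise. Systematic resampling is a common lower-variance drop-in replacement; a minimal sketch, assuming the weights are already normalised:

def systematic_resample(particles, weights):
    """Systematic resampling: a single uniform draw shifts n evenly
    spaced points, which are then mapped through the cumulative weights."""
    n = len(particles)
    positions = (np.random.uniform() + np.arange(n)) / n
    cdf = np.cumsum(weights)
    cdf[-1] = 1.0  # guard against floating-point round-off
    return particles[np.searchsorted(cdf, positions)]

Inside the loop, the two resampling lines can then be replaced by particles = systematic_resample(particles, weights).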