[Paper] Autonomous Learning of Attractors for Neuromorphic Computing with Wien Bridge Oscillator Networks

Published: December 16, 2025
Source: arXiv - 2512.14869v1

Overview

The paper introduces a neuromorphic computing primitive built from networks of Wien‑bridge oscillators whose phase relationships encode information. The oscillators are wired together through tunable resistive couplers governed by a simple Hebbian learning rule, so the system learns and recalls patterns continuously, with no distinct training and inference stages. The work demonstrates that such analog oscillator networks can act as energy‑based attractor machines, opening a path toward low‑power, self‑organizing hardware for pattern recognition and adaptive control.

Key Contributions

  • Oscillatory neuromorphic primitive: Implements pattern storage in the relative phases of coupled Wien‑bridge oscillators.
  • Local Hebbian learning rule: Continuously updates resistive couplings based on instantaneous phase correlations, merging learning and inference into a single dynamical process.
  • Energy‑based analysis: Derives a Kuramoto‑style phase model with an effective energy function, proving that learned phase configurations become attractor states (a generic form of this model is written out after this list).
  • Hardware validation: Shows the concept works both in simulation and on a physical prototype, confirming robustness to component tolerances and noise.
  • 2‑4‑2 architecture with hidden layer: Demonstrates a small multilayer network where multiple hidden‑layer phase configurations map to the same visible output, mimicking distributed representations.
  • Surprise‑driven dynamics: Observes transient energy spikes when inputs change, illustrating how the network autonomously reshapes its energy landscape to reduce “surprise”.
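
The paper's exact derivation is in the source; for orientation, a standard Kuramoto‑type phase model with symmetric couplings w_ij, together with the energy function such models descend, takes the following generic form (a sketch of the usual textbook formulation, not necessarily the authors' precise equations):

```latex
% Generic Kuramoto-type dynamics and energy (identical natural frequencies,
% written in the rotating frame); the paper's derived model may add terms.
\begin{aligned}
\dot{\theta}_i &= \sum_{j \neq i} w_{ij}\,\sin(\theta_j - \theta_i), \\
E(\theta)      &= -\tfrac{1}{2} \sum_{i \neq j} w_{ij}\,\cos(\theta_i - \theta_j).
\end{aligned}
```

With symmetric couplings, the dynamics satisfy dθ_i/dt = −∂E/∂θ_i, so the phases perform gradient descent on E and its minima are attractors; a Hebbian update proportional to cos(θ_i − θ_j) deepens the minimum at the currently visited phase pattern, which is precisely the attractor‑formation mechanism the paper describes.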

Methodology

  1. Circuit building block – Each node is a Wien‑bridge oscillator (a classic RC‑based sinusoidal generator). The oscillators are coupled through programmable resistive links that can be adjusted on the fly.
  2. Phase encoding – Information is stored in the relative phases between oscillators (e.g., a 0° vs. 180° offset represents a binary bit).
  3. Learning rule – A Hebbian update (Δw_ij ∝ cos(θ_i - θ_j)) is applied locally to each coupling resistor, strengthening links when two oscillators are in phase and weakening them when out of phase. This rule runs continuously as the circuit oscillates (see the simulation sketch after this list).
  4. Mathematical model – The dynamics are reduced to a Kuramoto‑type phase equation with an associated energy function E(θ). Minima of this energy correspond to stable phase patterns (attractors).
  5. Experimental setup – A prototype board implements a 2‑4‑2 network (2 visible → 4 hidden → 2 visible). Input patterns are injected by fixing the phases of the visible oscillators; the hidden layer evolves autonomously, and the output phases are read back.
  6. Evaluation – Simulations and hardware measurements track phase convergence, energy evolution, and recall accuracy across multiple training cycles.
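
A minimal numerical sketch of steps 2–5 (hypothetical code, not from the paper: the network size, learning gain `eta`, time step, and clamping scheme are illustrative assumptions layered on the generic Kuramoto‑plus‑Hebbian form shown earlier):

```python
import numpy as np

def simulate(n=8, steps=20000, dt=1e-3, eta=0.05,
             clamp=None, theta0=None, w0=None, seed=0):
    """Kuramoto phase network with a continuously running Hebbian rule.

    n=8 mimics the 2-4-2 layout (2 input + 4 hidden + 2 output oscillators).
    clamp maps oscillator index -> fixed phase for the 'visible' units;
    all other units evolve freely. Every parameter here is illustrative.
    """
    rng = np.random.default_rng(seed)
    theta = rng.uniform(0, 2 * np.pi, n) if theta0 is None else theta0.copy()
    w = 0.01 * rng.standard_normal((n, n)) if w0 is None else w0.copy()
    w = (w + w.T) / 2                       # keep couplings symmetric
    np.fill_diagonal(w, 0.0)

    for _ in range(steps):
        diff = theta[None, :] - theta[:, None]        # diff[i, j] = θ_j − θ_i
        # Kuramoto dynamics (identical natural frequencies, rotating frame):
        # dθ_i/dt = Σ_j w_ij sin(θ_j − θ_i)
        theta = theta + dt * np.sum(w * np.sin(diff), axis=1)
        # Local Hebbian rule, running continuously: dw_ij/dt ∝ cos(θ_i − θ_j)
        w += dt * eta * np.cos(diff)                  # cos is even, so cos(diff) works
        np.fill_diagonal(w, 0.0)
        if clamp:                                     # inject input by pinning phases
            for i, phi in clamp.items():
                theta[i] = phi

    energy = -0.5 * np.sum(w * np.cos(theta[:, None] - theta[None, :]))
    return theta % (2 * np.pi), w, energy

# Store a binary pattern on the two input oscillators (0 vs pi = one bit each).
phases, weights, E = simulate(clamp={0: 0.0, 1: np.pi})
print("final phases (rad):", np.round(phases, 2), " energy:", round(E, 3))
```

Setting `eta` to zero after convergence and perturbing `theta` is a quick way to check that the learned phase pattern is restored, which is the attractor‑recall behavior the evaluation step measures.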

Results & Findings

  • Attractor formation: After a few learning cycles, the network settles into distinct phase configurations that remain stable even after the learning rule is turned off, confirming attractor behavior.
  • Energy landscape reshaping: Switching inputs produces a sharp rise in the energy function, followed by a smooth relaxation as the network finds a new attractor, mirroring “surprise reduction” in predictive‑coding theories (a toy probe of this effect is sketched after this list).
  • Robust recall: The hardware prototype reliably reproduces stored patterns despite component mismatches and thermal noise, with recall errors below 5% for the tested patterns.
  • Hidden‑layer degeneracy: Multiple hidden‑layer phase states converge to the same visible output, demonstrating distributed encoding and the potential for richer internal representations.
  • Scalability hints: Simulations suggest that adding more oscillators and layers preserves the learning dynamics, though coupling strength and frequency dispersion become critical design parameters.
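
The “surprise” transient can be probed with the hypothetical `simulate` sketch from the Methodology section (illustrative parameters only): warm‑starting the learned network on a flipped input raises E(θ) before the dynamics relax into a new attractor. Logging the energy at every step inside the loop would reveal the spike‑and‑decay shape the paper reports; the snippet below only compares the settled energies before and after the switch.

```python
# Train on one input pattern, then flip it and warm-start from the learned state.
theta1, w1, E1 = simulate(clamp={0: 0.0, 1: np.pi})
theta2, w2, E2 = simulate(clamp={0: np.pi, 1: 0.0},   # flipped bits: the "surprise"
                          theta0=theta1, w0=w1, steps=5000)
print(f"energy after first input: {E1:.3f}; after re-relaxation: {E2:.3f}")
```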

Practical Implications

  • Ultra‑low‑power inference: Since the system relies on passive RC oscillators and analog resistive updates, power consumption can be orders of magnitude lower than digital ASICs for similar pattern‑matching tasks.
  • Edge AI & sensor fusion: Continuous learning without a separate training phase makes the hardware ideal for on‑device adaptation (e.g., wearable health monitors that learn a user’s baseline rhythms).
  • Robustness to variability: The attractor dynamics naturally tolerate component drift and noise, which is valuable for harsh environments (industrial IoT, aerospace).
  • Hardware‑native energy‑based models: The approach provides a physical substrate for energy‑based machine‑learning frameworks (Boltzmann machines, Hopfield networks) that are currently simulated in software.
  • Rapid prototyping: Wien‑bridge oscillators are inexpensive and easy to fabricate on standard PCB processes, lowering the barrier for research labs and startups to experiment with neuromorphic circuits.

Limitations & Future Work

  • Scaling challenges: As the network grows, maintaining tight frequency matching among oscillators and preventing unwanted synchronization becomes harder.
  • Learning rule simplicity: The Hebbian update is linear and may struggle with more complex, non‑binary patterns; richer plasticity mechanisms (e.g., spike‑timing‑dependent plasticity) could be explored.
  • Readout latency: Extracting phase information requires analog‑to‑digital conversion or phase‑locked loops, which adds overhead for high‑speed applications.
  • Hardware variability: While the prototype tolerates some mismatch, large‑scale integration will need systematic calibration or adaptive compensation circuits.
  • Future directions: The authors suggest investigating heterogeneous oscillator types, on‑chip programmable resistors, and hierarchical multilayer architectures to push toward practical, large‑scale neuromorphic processors.

Authors

  • Riley Acker
  • Aman Desai
  • Garrett Kenyon
  • Frank Barrows

Paper Information

  • arXiv ID: 2512.14869v1
  • Categories: cs.NE, cs.ET, nlin.AO
  • Published: December 16, 2025
  • PDF: https://arxiv.org/pdf/2512.14869v1