
Commit 03a94a0

benkroehsmuffgaga authored and committed
feat: Extend UHEI/hxtorch and UHEI/jaxsnn
1 parent 5d10e96

File tree

2 files changed: +14 −2 lines changed

  • content/neuromorphic-computing/software/snn-frameworks


content/neuromorphic-computing/software/snn-frameworks/hxtorch/index.md (6 additions, 1 deletion)

```diff
@@ -11,8 +11,13 @@ supports_hardware: True
 supports_NIR: True
 language: Python
 draft: false
+maintainer: Electronic Visions Group
 ---
 
 ## Overview
 
-**hxtorch** is a deep learning Python library used for numerical simulation, neuromorphic emulation and training of spiking neural networks (SNNs) with BrainScaleS-2 neuromorphic hardware in-the-loop.
+**hxtorch** is a deep learning Python library used for numerical simulation, neuromorphic emulation and training of spiking neural networks (SNNs). Built on top of PyTorch, it integrates the automatic differentiation and modular design of the PyTorch ecosystem with neuromorphic experiment execution, enabling hardware-in-the-loop training workflows on the BrainScaleS-2 neuromorphic hardware system.
+
+The library abstracts hardware configuration and experiment execution while letting users define networks from familiar PyTorch modules such as LIF and LI neuron layers and synaptic connections. By separating network definition from execution, hxtorch supports both software simulation and hardware emulation within a single, unified API.
+
+The framework supports surrogate-gradient-based learning, custom backward functions and seamless conversion between sparse, event-based observables and dense PyTorch tensors. It is designed to facilitate iterative model development, hybrid simulation/emulation, and the integration of hardware observables such as spike trains and membrane voltages directly into the training loop.
```

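The "surrogate-gradient-based learning" and "custom backward functions" mentioned in the added overview rest on PyTorch's `torch.autograd.Function` mechanism. As a generic illustration of that technique (a minimal sketch, not hxtorch's actual API; the class name and the steepness value `beta` are illustrative):

```python
import torch

class SuperSpike(torch.autograd.Function):
    """Heaviside spike function with a fast-sigmoid surrogate gradient."""

    beta = 10.0  # surrogate steepness (illustrative value)

    @staticmethod
    def forward(ctx, v):
        ctx.save_for_backward(v)
        # Non-differentiable threshold crossing in the forward pass.
        return (v > 0.0).float()

    @staticmethod
    def backward(ctx, grad_output):
        (v,) = ctx.saved_tensors
        # A smooth surrogate replaces the true derivative, which is
        # zero almost everywhere and would block gradient flow.
        surrogate = 1.0 / (SuperSpike.beta * v.abs() + 1.0) ** 2
        return grad_output * surrogate

membrane = torch.tensor([-0.5, 0.1, 0.7], requires_grad=True)
spikes = SuperSpike.apply(membrane)   # -> tensor([0., 1., 1.])
spikes.sum().backward()               # gradients flow through the surrogate
```

Training then proceeds with ordinary PyTorch optimizers; only the spike nonlinearity needs the custom backward.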
content/neuromorphic-computing/software/snn-frameworks/jaxsnn/index.md (8 additions, 1 deletion)

```diff
@@ -11,8 +11,15 @@ supports_hardware: True
 supports_NIR: True
 language: Python
 draft: false
+maintainer: Electronic Visions Group
 ---
 
 ## Overview
 
-**jaxsnn** is a deep learning Python library used for event-based numerical simulation, neuromorphic emulation and training of spiking neural networks (SNNs) with BrainScaleS-2 neuromorphic hardware in-the-loop.
+**jaxsnn** is a deep learning Python library used for event-based numerical simulation, neuromorphic emulation and training of spiking neural networks (SNNs) with BrainScaleS-2 neuromorphic hardware in-the-loop. It is maintained by the Electronic Visions group at Heidelberg University.
+
+Unlike conventional deep learning libraries, which rely on dense tensor representations and time-discretized updates, jaxsnn is designed for event-driven computation. It operates directly on asynchronous spike events and supports gradient-based learning with methods such as EventProp and “Fast & Deep” spike-time coding. The library leverages JAX’s automatic differentiation, just-in-time compilation (via XLA) and hardware-acceleration support to enable efficient and composable training of biologically inspired SNNs.
+
+jaxsnn is tailored for integration with analog neuromorphic systems such as BrainScaleS-2. It supports hardware-in-the-loop training by offloading the forward pass to neuromorphic hardware while computing gradients in software. For development and testing, jaxsnn can also be used as a pure simulation framework.
+
+With native event-based processing, support for custom VJP definitions and a modular, JAX-compatible design, jaxsnn provides a flexible platform for bridging the gap between modern machine learning tools and the sparse, real-time nature of neuromorphic computing. It is particularly suited to research on energy-efficient learning algorithms, continuous-time dynamics, and hardware-constrained SNN modeling.
```

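The "custom VJP definitions" the jaxsnn overview mentions refer to JAX's `jax.custom_vjp` mechanism, which lets a library attach a hand-written backward pass to a non-differentiable event, the same role the custom backward plays on the PyTorch side. A generic sketch of the technique (not jaxsnn's actual API; the function names and `beta` are illustrative):

```python
import jax
import jax.numpy as jnp

@jax.custom_vjp
def spike(v):
    """Non-differentiable threshold crossing."""
    return jnp.where(v > 0.0, 1.0, 0.0)

def spike_fwd(v):
    # Forward pass; save the membrane value as residual for the backward pass.
    return spike(v), v

def spike_bwd(v, g):
    # Hand-written VJP: a fast-sigmoid surrogate derivative.
    beta = 10.0
    return (g / (beta * jnp.abs(v) + 1.0) ** 2,)

spike.defvjp(spike_fwd, spike_bwd)

# Gradients now flow through the hard threshold, and the whole
# function remains composable with jit/grad/vmap as usual.
grad_fn = jax.jit(jax.grad(lambda v: jnp.sum(spike(v))))
g = grad_fn(jnp.array([-0.5, 0.1, 0.7]))
```

In a hardware-in-the-loop setting, the forward function would instead trigger an experiment on the neuromorphic system, while the registered VJP supplies the software-side gradient.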