feat(contributors): update Matei Stan's info and add photo

Updates Matei Stan's contributor entry with his LinkedIn profile and adds his photo, which is also used in the student talks section. Also fixes minor issues with the speaker bio and references in the student talk.
description: "Matei Stan is a PhD student at the University of Manchester, focusing on State Space Models (SSMs) and their application in Spiking Neural Networks for long-range sequential tasks."
4
-
image: "matei-stan.jpg"# Ask him for this photo
4
+
image: "matei-stan.jpg"
5
5
social:
6
6
- icon: "fa-brands fa-linkedin"
7
-
link: "https://www.linkedin.com/in/username"# Ask for his LinkedIn/GitHub/Website URL
7
+
link: "https://www.linkedin.com/in/matei-stan"
8
8
title: "linkedin"
9
9
draft: false
10
10
---
@@ -13,4 +13,4 @@ Matei Stan is a third-year PhD student in the Department of Computer Science at
13
13
Manchester, UK. He is supervised by Dr Oliver Rhodes in the Advanced Processor Technologies
14
14
(APT) group. In his PhD work, Matei has primarily focused on the applications of deep State Space
15
15
Models (SSMs), such as S4, in neuromorphic computing, and their potential in scaling
16
-
energy-efficient algorithms for long-range sequential tasks.
16
+
energy-efficient algorithms for long-range sequential tasks.
content/neuromorphic-computing/student-talks/learning-long-sequences-in-snns/index.md (8 additions, 9 deletions)
@@ -3,20 +3,20 @@ title: "Student Talk: Learning Long Sequences in Spiking Neural Networks"
 author:
   - "Matei Stan"
 date: 2025-07-27
-start_time: "08:00"
-end_time: "09:00"
+start_time: "08:30"
+end_time: "09:45"
 time_zone: "EST"
 description: "Explore how State Space Models (SSMs) combined with Spiking Neural Networks (SNNs) can outperform Transformers on long-sequence tasks, and learn about a novel feature mixing layer that challenges assumptions about binary activations."
 upcoming: true
-upcoming_url: "https://link.to.the.event" # Ask him for this
-image: "banner.png" # Ask him for a 1200x630px banner image
-speaker_photo: "matei-stan.jpg" # This should be the same as his contributor photo
+image: "banner.png"
+speaker_photo: "matei-stan.jpg"
 type: "student-talks"
 speaker_bio: "Matei Stan is a third-year PhD student in the Department of Computer Science at the University of Manchester, UK. He is supervised by Dr Oliver Rhodes in the Advanced Processor Technologies (APT) group. In his PhD work, Matei has primarily focused on the applications of deep State Space Models (SSMs), such as S4, in neuromorphic computing, and their potential in scaling energy-efficient algorithms for long-range sequential tasks."
 ---

-Matei’s published work, “Learning Long Sequences in Spiking Neural Networks” [2], systematically investigates, for the first time, the intersection of state-of-the-art SSMs with Spiking Neural Networks (SNNs) for long-range sequence modelling. Results suggest that SSM-based SNNs can outperform the Transformer on all tasks of a well-established long-range sequence modelling benchmark, the “Long Range Arena” [3]. It is also shown that SSM-based SNNs can outperform current state-of-the-art SNNs with fewer parameters on sequential image classification. Finally, a novel feature mixing layer is introduced, improving SNN accuracy while challenging assumptions about the role of binary activations in SNNs. This work paves the way for deploying powerful SSM-based architectures, such as Large Language Models, on neuromorphic hardware for energy-efficient long-range sequence modelling.
+Matei’s published work, “Learning Long Sequences in Spiking Neural Networks” [1], systematically investigates, for the first time, the intersection of state-of-the-art State Space Models (SSMs) with Spiking Neural Networks (SNNs) for long-range sequence modelling. Results suggest that SSM-based SNNs can outperform the Transformer on all tasks of a well-established long-range sequence modelling benchmark, the “Long Range Arena” [2]. It is also shown that SSM-based SNNs can outperform current state-of-the-art SNNs with fewer parameters on sequential image classification. Finally, a novel feature mixing layer is introduced, improving SNN accuracy while challenging assumptions about the role of binary activations in SNNs. This work paves the way for deploying powerful SSM-based architectures, such as Large Language Models, on neuromorphic hardware for energy-efficient long-range sequence modelling.

 This talk will highlight, at a high level, the similarities in computational primitives between SSMs and existing neuromorphic standards such as Leaky Integrate-and-Fire (LIF) neurons. It will also focus on the specific drawbacks brought about by the introduction of binary activations in SSMs, and the extent to which these can be mitigated by more accurate surrogate gradient methods that account for non-differentiability. Finally, arguments will be presented in favour of separating biological plausibility from energy efficiency when attempting to create scalable neuromorphic solutions.

@@ -27,6 +27,5 @@ This talk will highlight, at a high level, the similarities in computational pri
 - Q&A

 **References**
-[1]: Gu, A., Goel, K. and Ré, C., 2021. Efficiently modeling long sequences with structured state spaces. arXiv preprint arXiv:2111.00396.
-[2]: Stan, M.I. and Rhodes, O., 2024. Learning long sequences in spiking neural networks. Scientific Reports, 14(1), p.21957.
-[3]: Tay, Y., Dehghani, M., Abnar, S., Shen, Y., Bahri, D., Pham, P., Rao, J., Yang, L., Ruder, S. and Metzler, D., 2020. Long range arena: A benchmark for efficient transformers. arXiv preprint arXiv:2011.04006.
+[1]: Stan, M.I. and Rhodes, O., 2024. Learning long sequences in spiking neural networks. Scientific Reports, 14(1), p.21957.
+[2]: Tay, Y., Dehghani, M., Abnar, S., Shen, Y., Bahri, D., Pham, P., Rao, J., Yang, L., Ruder, S. and Metzler, D., 2020. Long range arena: A benchmark for efficient transformers. arXiv preprint arXiv:2011.04006.
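For readers unfamiliar with the talk's framing, the parallel between LIF neurons and SSM recurrences can be made concrete in a few lines of code. The following PyTorch sketch is purely illustrative, not code from the paper or the talk: it writes a leaky integrate-and-fire neuron as a discretised linear state-space recurrence and attaches a fast-sigmoid surrogate gradient to the non-differentiable spike threshold. The decay `beta`, the threshold, and the surrogate slope `k` are assumed values chosen for illustration.

```python
import torch

class SpikeFn(torch.autograd.Function):
    """Heaviside spike in the forward pass; fast-sigmoid surrogate
    gradient in the backward pass (a common surrogate choice)."""

    @staticmethod
    def forward(ctx, v):
        ctx.save_for_backward(v)
        return (v > 0.0).float()

    @staticmethod
    def backward(ctx, grad_output):
        (v,) = ctx.saved_tensors
        # The true derivative of the step function is zero almost everywhere,
        # so we substitute the derivative of a fast sigmoid with slope k.
        k = 10.0
        return grad_output / (1.0 + k * v.abs()) ** 2

def lif_step(v, x, beta=0.9, threshold=1.0):
    """One discretised LIF step. Structurally this is the linear SSM
    recurrence h[t+1] = A h[t] + B x[t] with scalar A = beta and B = 1,
    followed by a binary (spiking) nonlinearity and a soft reset."""
    v = beta * v + x                      # leaky integration: the SSM state update
    spike = SpikeFn.apply(v - threshold)  # binary activation
    v = v - spike * threshold             # soft reset after a spike
    return v, spike

# Toy usage: push a random input sequence through one neuron and
# backpropagate through time via the surrogate gradient.
x = torch.rand(100, requires_grad=True)
v = torch.zeros(1)
spikes = []
for t in range(100):
    v, s = lif_step(v, x[t])
    spikes.append(s)
torch.stack(spikes).sum().backward()  # succeeds only because of the surrogate
```

The structural point, as the talk argues, is that the membrane update is exactly an SSM recurrence with a particular choice of state matrices; what separates the SNN from the SSM is the binary spike nonlinearity and the surrogate gradient needed to train through it.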