
Commit 9de7f57 (parent: a2ca4f7)

feat: contributors: Update Matei Stan info and add photo

This commit updates Matei Stan's contributor information with his LinkedIn profile and adds his photo. Also includes his photo for the student talks section. Fixes minor issues with the speaker bio and references in the student talk.

File tree: 5 files changed (+12, -12 lines)

.gitignore

Lines changed: 1 addition & 0 deletions

@@ -26,3 +26,4 @@ yarn.lock
 content/**/*-og-*.jpg
 static/images/og-image-*.jpg
 tmp/og-cache-manifest.json
+/data/github_stars.json
Lines changed: 3 additions & 3 deletions

@@ -1,10 +1,10 @@
 ---
 title: "Matei Stan"
 description: "Matei Stan is a PhD student at the University of Manchester, focusing on State Space Models (SSMs) and their application in Spiking Neural Networks for long-range sequential tasks."
-image: "matei-stan.jpg" # Ask him for this photo
+image: "matei-stan.jpg"
 social:
   - icon: "fa-brands fa-linkedin"
-    link: "https://www.linkedin.com/in/username" # Ask for his LinkedIn/GitHub/Website URL
+    link: "https://www.linkedin.com/in/matei-stan"
     title: "linkedin"
 draft: false
 ---
@@ -13,4 +13,4 @@ Matei Stan is a third-year PhD student in the Department of Computer Science at
 Manchester, UK. He is supervised by Dr Oliver Rhodes in the Advanced Processor Technologies
 (APT) group. In his PhD work, Matei has primarily focused on the applications of deep State Space
 Models (SSMs), such as S4, in neuromorphic computing, and their potential in scaling
-energy-efficient algorithms for long-range sequential tasks.
+energy-efficient algorithms for long-range sequential tasks.
Binary file added (539 KB)

content/neuromorphic-computing/student-talks/learning-long-sequences-in-snns/index.md

Lines changed: 8 additions & 9 deletions

@@ -3,20 +3,20 @@ title: "Student Talk: Learning Long Sequences in Spiking Neural Networks"
 author:
   - "Matei Stan"
 date: 2025-07-27
-start_time: "08:00"
-end_time: "09:00"
+start_time: "08:30"
+end_time: "09:45"
 time_zone: "EST"
 description: "Explore how State Space Models (SSMs) combined with Spiking Neural Networks (SNNs) can outperform Transformers on long-sequence tasks, and learn about a novel feature mixing layer that challenges assumptions about binary activations."
 upcoming: true
-upcoming_url: "https://link.to.the.event" # Ask him for this
+upcoming_url: "https://teams.microsoft.com/l/meetup-join/19%3Ameeting_OTBkNTY5MjgtMjE3Ni00OTFmLWEwNzktN2QwZTU1NWIxNDc2%40thread.v2/0?context=%7B%22Tid%22%3A%22c152cb07-614e-4abb-818a-f035cfa91a77%22%2C%22Oid%22%3A%223f444780-d657-4917-993e-0f42adeff90e%22%7D"
 video: ""
-image: "banner.png" # Ask him for a 1200x630px banner image
-speaker_photo: "matei-stan.jpg" # This should be the same as his contributor photo
+image: "banner.png"
+speaker_photo: "matei-stan.jpg"
 type: "student-talks"
 speaker_bio: "Matei Stan is a third-year PhD student in the Department of Computer Science at the University of Manchester, UK. He is supervised by Dr Oliver Rhodes in the Advanced Processor Technologies (APT) group. In his PhD work, Matei has primarily focused on the applications of deep State Space Models (SSMs), such as S4, in neuromorphic computing, and their potential in scaling energy-efficient algorithms for long-range sequential tasks."
 ---
 
-Matei’s published work, “Learning Long Sequences in Spiking Neural Networks” [2], systematically investigates, for the first time, the intersection of the State‑of‑The‑Art SSMs with Spiking Neural Networks (SNNs) for long‑range sequence modelling. Results suggest that SSM‑based SNNs can outperform the Transformer on all tasks of a well‑established long‑range sequence modelling benchmark - the “Long-Range Arena” [3]. It is also shown that the SSM‑based SNNs can outperform current State‑of‑The‑Art SNNs with fewer parameters on sequential image classification. Finally, a novel feature mixing layer is introduced, improving SNN accuracy while challenging assumptions about the role of binary activations in SNNs. This work paves the way for deploying powerful SSM-based architectures, such as Large Language Models, on neuromorphic hardware for energy-efficient long-range sequence modelling.
+Matei’s published work, “Learning Long Sequences in Spiking Neural Networks” [1], systematically investigates, for the first time, the intersection of the State‑of‑The‑Art State Space Models (SSMs) with Spiking Neural Networks (SNNs) for long‑range sequence modelling. Results suggest that SSM‑based SNNs can outperform the Transformer on all tasks of a well‑established long‑range sequence modelling benchmark - the “Long-Range Arena” [2]. It is also shown that the SSM‑based SNNs can outperform current State‑of‑The‑Art SNNs with fewer parameters on sequential image classification. Finally, a novel feature mixing layer is introduced, improving SNN accuracy while challenging assumptions about the role of binary activations in SNNs. This work paves the way for deploying powerful SSM-based architectures, such as Large Language Models, on neuromorphic hardware for energy-efficient long-range sequence modelling.
 
 This talk will highlight, at a high level, the similarities in computational primitives between SSMs and the existing neuromorphic standards such as Leaky Integrate-and-Fire (LIF) neurons. It will also focus on the specific drawbacks brought about by the introduction of binary activations in SSMs, as well as the extent to which these can be mitigated by the development of more accurate surrogate gradient methods that account for non-differentiability. Finally, arguments will be presented in favour of separating biological plausibility from energy efficiency in attempting to create scalable neuromorphic solutions.
 
@@ -27,6 +27,5 @@ This talk will highlight, at a high level, the similarities in computational pri
 - Q&A
 
 **References**
-[1]: Gu, A., Goel, K. and Ré, C., 2021. Efficiently modeling long sequences with structured state spaces. arXiv preprint arXiv:2111.00396.
-[2]: Stan, M.I. and Rhodes, O., 2024. Learning long sequences in spiking neural networks. Scientific Reports, 14(1), p.21957.
-[3]: Tay, Y., Dehghani, M., Abnar, S., Shen, Y., Bahri, D., Pham, P., Rao, J., Yang, L., Ruder, S. and Metzler, D., 2020. Long range arena: A benchmark for efficient transformers. arXiv preprint arXiv:2011.04006.
+[1]: Stan, M.I. and Rhodes, O., 2024. Learning long sequences in spiking neural networks. Scientific Reports, 14(1), p.21957.
+[2]: Tay, Y., Dehghani, M., Abnar, S., Shen, Y., Bahri, D., Pham, P., Rao, J., Yang, L., Ruder, S. and Metzler, D., 2020. Long range arena: A benchmark for efficient transformers. arXiv preprint arXiv:2011.04006.
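The talk abstract above points at a shared computational primitive between SSMs and LIF neurons. As a rough, hypothetical illustration (not code from this commit or from the cited paper), both reduce to a leaky linear recurrence h[t] = a·h[t-1] + b·x[t]; the LIF neuron adds a non-differentiable spike on top, which training approximates with a surrogate gradient:

```python
# Hypothetical sketch: the scalar recurrence shared by linear SSMs
# and LIF neurons. All names and constants here are illustrative.

def ssm_step(h, x, a=0.9, b=0.1):
    """One step of a scalar linear state-space recurrence."""
    return a * h + b * x

def lif_step(v, x, decay=0.9, threshold=1.0):
    """One step of a leaky integrate-and-fire neuron.

    The same leaky recurrence as the SSM, plus a non-differentiable
    Heaviside spike and a reset; backprop replaces the Heaviside
    derivative with a surrogate gradient.
    """
    v = decay * v + x
    spike = 1.0 if v >= threshold else 0.0
    v = v - spike * threshold  # soft reset after a spike
    return v, spike

def surrogate_grad(v, threshold=1.0, beta=5.0):
    """Fast-sigmoid surrogate for d(spike)/d(v), used during training."""
    return 1.0 / (1.0 + beta * abs(v - threshold)) ** 2

# Drive both units with the same input sequence.
h, v = 0.0, 0.0
spikes = []
for x in [0.5, 0.8, 1.2, 0.1, 0.9]:
    h = ssm_step(h, x)
    v, s = lif_step(v, x)
    spikes.append(s)
# spikes == [0.0, 1.0, 1.0, 0.0, 1.0]: the membrane crosses threshold
# only when the leaky accumulation of recent inputs is large enough.
```

The surrogate is exact (gradient 1) at the threshold and decays away from it, which is what lets the otherwise binary activation pass useful gradients during training.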
