Commit fe7f7e9

committed
more formatting fixes to A1
1 parent 09ab980 commit fe7f7e9

File tree

1 file changed: +81 −61 lines changed
  • content/assignments/Assignment_1:Hopfield_Networks


content/assignments/Assignment_1:Hopfield_Networks/README.md

Lines changed: 81 additions & 61 deletions
@@ -18,37 +18,43 @@ You should start by reading [Amit et al. (1985)](https://www.dropbox.com/scl/fi/
### 1. Implement Memory Storage and Retrieval

#### Objective

Write functions that implement the core operations of a Hopfield network.

#### Memory Storage

Implement the Hebbian learning rule to compute the weight matrix, given a set of network configurations (memories). This is described in *Equation 1.5* of the paper:

Let $p$ be the number of patterns and $\xi_i^\mu \in \{-1, +1\}$ the value of neuron $i$ in pattern $\mu$. The synaptic coupling between neurons $i$ and $j$ is:

$$
J_{ij} = \sum_{\mu=1}^p \xi_i^\mu \xi_j^\mu
$$

Note that the matrix is symmetric ($J_{ij} = J_{ji}$), and there are no self-connections by definition ($J_{ii} = 0$).
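The storage rule above amounts to a single matrix product. A minimal NumPy sketch (the function name `store_memories` is ours, not from the assignment):

```python
import numpy as np

def store_memories(xi):
    """Hebbian weight matrix: J_ij = sum_mu xi_i^mu * xi_j^mu, with J_ii = 0.

    xi: array of shape (p, N) with entries in {-1, +1}, one pattern per row.
    """
    J = xi.T @ xi            # sum of outer products over the p patterns
    np.fill_diagonal(J, 0)   # no self-connections by definition
    return J
```

Because `xi.T @ xi` is symmetric by construction, $J_{ij} = J_{ji}$ holds automatically.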

#### Memory Retrieval

Implement the retrieval rule using *Equation 1.3* and surrounding discussion. At each time step, each neuron updates according to its local field:

$$
h_i = \sum_{j=1}^N J_{ij} S_j
$$

Each neuron updates its state by aligning with the sign of the field:

$$
S_i(t+1) = \text{sign}(h_i(t)) = \text{sign} \left( \sum_{j} J_{ij} S_j(t) \right)
$$

Here, $S_i \in \{-1, +1\}$ is the current state of neuron $i$.
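A minimal synchronous-update sketch of this rule (the tie-break at $h_i = 0$, mapped here to $+1$, is our assumption; the paper's asynchronous dynamics would update one neuron at a time):

```python
import numpy as np

def update(J, S, steps=10):
    """Apply S_i(t+1) = sign(sum_j J_ij S_j(t)) until a fixed point or `steps`."""
    S = S.copy()
    for _ in range(steps):
        h = J @ S                         # local fields h_i
        S_new = np.where(h >= 0, 1, -1)   # sign of the field, with 0 -> +1
        if np.array_equal(S_new, S):      # converged: state is a fixed point
            return S_new
        S = S_new
    return S
```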

---

### 2. Test with a Small Network

Encode the following test memories in a Hopfield network with $N = 5$ neurons:

$$
\xi^1 = [+1, -1, +1, -1, +1] \\
@@ -57,7 +63,7 @@ $$
- Store these memories using the Hebbian rule.
- Test retrieval by presenting the network with noisy versions (e.g., flipping a sign, or setting some entries to 0).
- Briefly discuss your observations.
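A sketch of the flip-a-sign test using the first pattern above (the remaining test patterns are elided in this diff; they would be stacked as extra rows of `patterns`):

```python
import numpy as np

# First test memory from the assignment; add the other patterns as rows.
patterns = np.array([[+1, -1, +1, -1, +1]])

J = patterns.T @ patterns      # Hebbian storage
np.fill_diagonal(J, 0)

noisy = patterns[0].copy()
noisy[2] *= -1                 # flip one sign

S = noisy
for _ in range(10):            # synchronous updates
    S = np.where(J @ S >= 0, 1, -1)

recovered = np.array_equal(S, patterns[0])   # True: the flipped bit is repaired
```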

Questions to consider:

@@ -70,44 +76,51 @@ Questions to consider:
### 3. Evaluate Storage Capacity

#### Objective

Determine how memory recovery degrades as you vary:

- **Network size** (number of neurons)
- **Number of stored memories**

To generate $m$ memories $\xi_1, \dots, \xi_m$ for a network of size $N$, use:

```python
import numpy as np
xi = 2 * (np.random.rand(m, N) > 0.5) - 1
```

#### Method

- For each configuration, run multiple trials.
- For each trial, measure whether **at least 99%** of the memory is recovered.

#### Visualization 1

Create a heatmap:

- $x$-axis: network size
- $y$-axis: number of stored memories
- Color: proportion of memories retrieved with ≥99% accuracy

#### Visualization 2

Plot the expected number of accurately retrieved memories vs. network size.

Let:

- $P[m, N] \in [0, 1]$: proportion of $m$ memories accurately retrieved in a network of size $N$
- $\mathbb{E}[R_N]$: expected number of successfully retrieved memories

Then:

$$
\mathbb{E}[R_N] = \sum_{m=1}^{M} m \cdot P[m, N]
$$

Where $M$ is the maximum number of memories tested.

#### Follow-Up

- What relationship (if any) emerges between network size and capacity?
- Can you develop rules or intuitions that help predict a network’s capacity?
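The method can be sketched as a small sweep. The sizes, loads, and trial counts below are illustrative choices, not prescribed by the assignment; the resulting matrix `P` feeds both visualizations:

```python
import numpy as np

def trial_success(N, m, rng, steps=20):
    """One trial: store m random patterns and return the fraction
    recovered to >= 99% accuracy when cued with themselves."""
    xi = 2 * (rng.random((m, N)) > 0.5) - 1
    J = xi.T @ xi
    np.fill_diagonal(J, 0)
    ok = 0
    for mu in range(m):
        S = xi[mu].copy()
        for _ in range(steps):               # synchronous updates
            S = np.where(J @ S >= 0, 1, -1)
        ok += np.mean(S == xi[mu]) >= 0.99
    return ok / m

rng = np.random.default_rng(0)
sizes = [50, 100]        # illustrative; sweep a finer grid in practice
loads = [2, 5, 10]
P = np.array([[np.mean([trial_success(N, m, rng) for _ in range(5)])
               for N in sizes] for m in loads])
# Rows of P index the number of stored memories, columns the network size;
# E[R_N] = sum_m m * P[m, N] then gives the curve for Visualization 2.
```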
@@ -116,59 +129,65 @@ Where $M$ is the maximum number of memories tested.
### 4. Simulate Cued Recall

#### Objective

Evaluate how the network performs associative recall when only a **cue** is presented.

#### Setup: A–B Pair Structure

- Each memory consists of two parts:
  - First half: **Cue** ($A$)
  - Second half: **Response** ($B$)

If $N$ is odd:
- Let cue length = $\lfloor N/2 \rfloor$
- Let response length = $\lceil N/2 \rceil$

Each full memory:

$$
\xi^\mu = \begin{bmatrix} A^\mu \\ B^\mu \end{bmatrix}
$$

#### Simulation Procedure

1. **Choose a memory** $\xi^\mu$
2. **Construct initial state** $x$:
   - Cue half: set to $A^\mu$
   - Response half: set to 0
3. **Evolve the network** using the update rule:

$$
x_i \leftarrow \text{sign} \left( \sum_j J_{ij} x_j \right)
$$

   - Optionally: **clamp** the cue (i.e., hold cue values fixed)
4. **Evaluate success**:
   - Compare recovered response to $B^\mu$
   - Mark as successful if ≥99% of bits match:

$$
\frac{1}{|B|} \sum_{i \in \text{response}} \mathbb{1}[x^*_i = B^\mu_i] \geq 0.99
$$

#### Analysis

- Repeat across many $A$–$B$ pairs
- For each network size $N$, compute the expected number of correctly retrieved responses
- Plot this value as a function of $N$

#### Optional Extensions

- Compare performance with and without clamping the cue
- Try cueing with noisy or partial versions of $A$
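One cued-recall trial can be sketched as follows (the helper `cued_recall` and the odd size $N = 101$ are our illustrative choices; clamping is shown on):

```python
import numpy as np

def cued_recall(J, A, B, clamp=True, steps=20):
    """Cue with A, zero the response half, evolve, and score against B."""
    n_cue = len(A)
    x = np.concatenate([A, np.zeros(len(B), dtype=int)])
    for _ in range(steps):
        x = np.where(J @ x >= 0, 1, -1)   # sign of the local field
        if clamp:
            x[:n_cue] = A                 # hold the cue values fixed
    return np.mean(x[n_cue:] == B) >= 0.99

rng = np.random.default_rng(1)
N = 101                                   # odd: cue length 50, response length 51
xi = 2 * (rng.random((3, N)) > 0.5) - 1   # three random A-B memories
J = xi.T @ xi
np.fill_diagonal(J, 0)

success = cued_recall(J, xi[0, :50], xi[0, 50:])
```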

---

### 5. Simulate Contextual Drift

#### Objective

Investigate how gradual changes in **context** influence which memories are recalled.

#### Setup: Item–Context Representation

@@ -177,40 +196,41 @@ For each trial:
- First 50 neurons: **Item**
- Last 50 neurons: **Context**

Create a sequence of 10 memories:

$$
\xi^t = \begin{bmatrix} \text{item}^t \\ \text{context}^t \end{bmatrix}
$$

Context drift:

- Set $\text{context}^1$ randomly
- For each subsequent $\text{context}^{t+1}$, copy $\text{context}^t$ and flip ~5% of the bits

#### Simulation Procedure

1. Store all 10 memories in the network.
2. For each memory $i = 1, \dots, 10$:
   - Cue the network with $\text{context}^i$
   - Set item neurons to 0
   - Run until convergence
   - For each stored memory $j$, compare recovered item to $\text{item}^j$
   - If ≥99% of bits match, record $j$ as retrieved
   - Record $\Delta = j - i$ (relative offset)

#### Analysis

- Repeat the procedure (e.g., 100 trials)
- For each $\Delta \in [-9, +9]$, compute:
  - Probability of retrieval
  - 95% confidence interval

#### Visualization

Create a line plot:

- $x$-axis: Relative position $\Delta$
- $y$-axis: Retrieval probability
- Error bars: 95% confidence intervals
Write a brief interpretation of the observed pattern.
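The drifting-context sequence can be generated with a sketch like this (`make_drift_sequence` is our own helper name; the ~5% flip rate follows the setup above):

```python
import numpy as np

def make_drift_sequence(n_mem=10, n_item=50, n_ctx=50, flip=0.05, rng=None):
    """Random item halves paired with a slowly drifting context half."""
    if rng is None:
        rng = np.random.default_rng()
    items = 2 * (rng.random((n_mem, n_item)) > 0.5) - 1
    ctx = np.empty((n_mem, n_ctx), dtype=int)
    ctx[0] = 2 * (rng.random(n_ctx) > 0.5) - 1   # context^1 is random
    for t in range(1, n_mem):
        ctx[t] = ctx[t - 1]                      # copy the previous context...
        flips = rng.random(n_ctx) < flip         # ...and flip ~5% of its bits
        ctx[t][flips] *= -1
    return np.hstack([items, ctx])               # xi^t = [item^t; context^t]
```

Storing the rows of the returned array and cueing with each context half (item neurons set to 0) then follows the procedure above.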
