
Commit e35db12 (parent: bc6ac14)

converted to mathjax for proper rendering in jupyterbook

1 file changed: content/assignments/Assignment_1:Hopfield_Networks/README.md (+34 −34 lines)
@@ -22,13 +22,13 @@ You should start by reading [Amit et al. (1985)](https://www.dropbox.com/scl/fi/

- **Memory Storage:** Implement the Hebbian learning rule to compute the weight matrix, given a set of network configurations (memories). This is described in **Equation 1.5** of the paper:

Let \$p\$ be the number of patterns and \$\xi_i^\mu \in \{-1, +1\}\$ the value of neuron \$i\$ in pattern \$\mu\$. The synaptic coupling between neurons \$i\$ and \$j\$ is:

$$
J_{ij} = \sum_{\mu=1}^p \xi_i^\mu \xi_j^\mu
$$

Note that the matrix is symmetric (\$J_{ij} = J_{ji}\$), and there are no self-connections by definition (\$J_{ii} = 0\$).

- **Memory Retrieval:** Implement the retrieval rule using **Equation 1.3** and surrounding discussion. At each time step, each neuron updates according to its **local field**:

@@ -42,13 +42,13 @@ You should start by reading [Amit et al. (1985)](https://www.dropbox.com/scl/fi/
$$
S_i(t+1) = \text{sign}(h_i(t)) = \text{sign} \left( \sum_{j} J_{ij} S_j(t) \right)
$$

Here \$S_i \in \{-1, +1\}\$ is the current state of neuron \$i\$.

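Together, the storage and retrieval rules above can be sketched in NumPy (a minimal illustration with my own function names, not the assignment's starter code):

```python
import numpy as np

def store(patterns):
    """Hebbian weights for patterns of shape (p, N) with entries in {-1, +1}."""
    J = patterns.T @ patterns       # J_ij = sum_mu xi_i^mu * xi_j^mu
    np.fill_diagonal(J, 0)          # no self-connections: J_ii = 0
    return J

def update(J, S):
    """One synchronous step: S_i(t+1) = sign(sum_j J_ij S_j(t))."""
    h = J @ S                       # local fields h_i
    return np.where(h >= 0, 1, -1)  # sign, with ties broken toward +1

# a stored pattern is a fixed point of the dynamics
xi = np.array([[+1, -1, +1, -1, +1]])
J = store(xi)
S = update(J, xi[0])                # returns xi[0] unchanged
```

The tie-break at \$h_i = 0\$ is an implementation choice; any consistent convention works.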
---

### 2. Test with a Small Network

Encode the following test memories in a Hopfield network with \$N = 5\$ neurons:

$$
\xi^1 = [+1, -1, +1, -1, +1] \\
@@ -75,7 +75,7 @@ Questions to consider:
- **Network Size** (number of neurons)
- **Number of Stored Memories**

To generate \$m\$ memories \$\xi_1, \dots, \xi_m\$ for a network of size \$N\$, use:

```python
import numpy as np

# m (number of memories) and N (network size) are set by your experiment loop;
# each memory is a length-N row vector of ±1 entries
xi = 2 * (np.random.rand(m, N) > 0.5) - 1
```
@@ -89,28 +89,28 @@ xi = 2 * (np.random.rand(m, N) > 0.5) - 1
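The lines elided from this hunk set up the sweep itself; one way the full experiment could look (a sketch under my own conventions: synchronous updates, initialization at the stored memory, and a fixed iteration cap) is:

```python
import numpy as np

def capacity_sweep(sizes, max_memories, n_iters=50, seed=0):
    """P[m-1, col]: fraction of m stored memories retrieved with >=99% accuracy
    in a network of size sizes[col]."""
    rng = np.random.default_rng(seed)
    P = np.zeros((max_memories, len(sizes)))
    for col, N in enumerate(sizes):
        for m in range(1, max_memories + 1):
            xi = 2 * (rng.random((m, N)) > 0.5) - 1   # m random ±1 memories
            J = xi.T @ xi
            np.fill_diagonal(J, 0)                    # Hebbian rule, J_ii = 0
            retrieved = 0
            for mu in range(m):
                S = xi[mu].copy()
                for _ in range(n_iters):              # iterate sign updates
                    S_new = np.where(J @ S >= 0, 1, -1)
                    if np.array_equal(S_new, S):      # converged
                        break
                    S = S_new
                if np.mean(S == xi[mu]) >= 0.99:      # >=99% of bits correct
                    retrieved += 1
            P[m - 1, col] = retrieved / m
    return P

P = capacity_sweep(sizes=[20, 50], max_memories=5)    # rows: m, columns: N
```

The resulting matrix is exactly what the heatmap in Visualization 1 displays.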

**Visualization 1:**
Create a heatmap:
- \$x\$-axis: network size
- \$y\$-axis: number of stored memories
- Color: proportion of memories retrieved with ≥99% accuracy

**Visualization 2:**
Plot the expected number of accurately retrieved memories vs. network size:

Let:
- \$P[m, N] \in [0, 1]\$: proportion of \$m\$ memories accurately retrieved in a network of size \$N\$
- \$\mathbb{E}[R_N]\$: expected number of successfully retrieved memories

Then:
$$
\mathbb{E}[R_N] = \sum_{m=1}^{M} m \cdot P[m, N]
$$

Where \$M\$ is the maximum number of memories tested.

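If \$P\$ is held as an array with rows indexed by \$m = 1, \dots, M\$ and columns by network size, the sum above is a single matrix-vector product (toy values below, for illustration only):

```python
import numpy as np

# toy P[m, N] values: M = 3 memory counts (rows), two network sizes (columns)
P = np.array([[1.0, 1.0],
              [0.5, 1.0],
              [0.0, 0.6]])

m = np.arange(1, P.shape[0] + 1)   # m = 1, ..., M
E_R = m @ P                        # E[R_N] = sum_m m * P[m, N], one value per N
```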
**Follow-Up:**

- What relationship (if any) emerges between network size and capacity?
- Can you develop rules or intuitions that help predict a network's capacity?

---

@@ -121,12 +121,12 @@ Where \( M \) is the maximum number of memories tested.
#### Setup: A–B Pair Structure

- Each memory consists of two parts:
  - First half: **Cue** (\$A\$)
  - Second half: **Response** (\$B\$)

If \$N\$ is odd:
- Let cue length = \$\lfloor N/2 \rfloor\$
- Let response length = \$\lceil N/2 \rceil\$

Each full memory:
$$
@@ -137,32 +137,32 @@ $$

For each trial:

1. **Choose a memory** \$\xi^\mu\$
2. **Construct initial state** \$x\$:
   - Cue half: set to \$A^\mu\$
   - Response half: set to 0
3. **Evolve the network** using the update rule:
   $$
   x_i \leftarrow \text{sign} \left( \sum_j J_{ij} x_j \right)
   $$
   - Optionally: **clamp** the cue (i.e., hold cue values fixed)
4. **Evaluate success**:
   - Compare recovered response to \$B^\mu\$
   - Mark as successful if ≥99% of bits match:
   $$
   \frac{1}{|B|} \sum_{i \in \text{response}} \mathbb{1}[x^*_i = B^\mu_i] \geq 0.99
   $$

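A single trial following steps 1–4 might look like this (a sketch: the helper name, the iteration cap, and the synchronous-update choice are mine, and clamping is shown as the optional variant):

```python
import numpy as np

def cued_recall_trial(J, memory, n_cue, clamp=False, n_iters=50):
    """Cue with the first n_cue units, zero the rest, evolve, and return
    the fraction of response bits that match the stored memory."""
    x = memory.copy()
    x[n_cue:] = 0                                # response half unknown
    for _ in range(n_iters):
        x_new = np.where(J @ x >= 0, 1, -1)      # x_i <- sign(sum_j J_ij x_j)
        if clamp:
            x_new[:n_cue] = memory[:n_cue]       # optionally hold the cue fixed
        if np.array_equal(x_new, x):             # converged
            break
        x = x_new
    return np.mean(x[n_cue:] == memory[n_cue:])  # accuracy on the response half

# example with N = 7: cue length floor(N/2) = 3, response length ceil(N/2) = 4
rng = np.random.default_rng(0)
N = 7
xi = 2 * (rng.random((1, N)) > 0.5) - 1
J = xi.T @ xi
np.fill_diagonal(J, 0)
acc = cued_recall_trial(J, xi[0], n_cue=N // 2)  # single stored pair: acc == 1.0
```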
#### Analysis

- Repeat across many \$A\$–\$B\$ pairs
- For each network size \$N\$, compute the **expected number** of correctly retrieved responses
- Plot this value as a function of \$N\$

#### Optional Extensions

- Compare performance with and without clamping the cue
- Try cueing with noisy or partial versions of \$A\$

---

@@ -184,33 +184,33 @@ $$

Context drift:

- Set \$\text{context}^1\$ randomly
- For each subsequent \$\text{context}^{t+1}\$, copy \$\text{context}^t\$ and flip ~5% of the bits

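The drifting context sequence can be generated as follows (a sketch; the helper name and the choice to flip exactly round(0.05 · length) distinct bits are mine):

```python
import numpy as np

def make_contexts(n_memories=10, context_len=100, flip_frac=0.05, seed=0):
    """context^1 is random; each later context copies its predecessor
    and flips ~5% of the bits."""
    rng = np.random.default_rng(seed)
    contexts = np.zeros((n_memories, context_len), dtype=int)
    contexts[0] = 2 * (rng.random(context_len) > 0.5) - 1
    n_flip = max(1, round(flip_frac * context_len))
    for t in range(1, n_memories):
        contexts[t] = contexts[t - 1]
        idx = rng.choice(context_len, size=n_flip, replace=False)
        contexts[t, idx] *= -1                 # flip the chosen bits
    return contexts

contexts = make_contexts()
```
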
#### Simulation Procedure

1. Store all 10 memories in the network.
2. For each memory \$i = 1, \dots, 10\$:
   - Cue the network with \$\text{context}^i\$
   - Set item neurons to 0
   - Run until convergence
   - For each stored memory \$j\$, compare recovered item to \$\text{item}^j\$
   - If ≥99% of bits match, record \$j\$ as retrieved
   - Record \$\Delta = j - i\$ (relative offset)

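The retrieval part of the procedure can be sketched as below (assuming, as in the setup, that each stored memory is the concatenation of context and item units; the function name and iteration cap are mine):

```python
import numpy as np

def drift_trial(J, contexts, items, i, n_iters=50):
    """Cue with context^i (item units zeroed), run to convergence,
    and return every j whose item^j matches >=99% of the recovered item."""
    c_len = contexts.shape[1]
    x = np.concatenate([contexts[i], np.zeros(items.shape[1], dtype=int)])
    for _ in range(n_iters):
        x_new = np.where(J @ x >= 0, 1, -1)
        if np.array_equal(x_new, x):           # converged
            break
        x = x_new
    recovered = x[c_len:]
    return [j for j in range(items.shape[0])
            if np.mean(recovered == items[j]) >= 0.99]

# sanity check: with a single stored memory, cueing its context recovers its item
rng = np.random.default_rng(1)
contexts = 2 * (rng.random((1, 40)) > 0.5) - 1
items = 2 * (rng.random((1, 40)) > 0.5) - 1
memories = np.concatenate([contexts, items], axis=1)
J = memories.T @ memories
np.fill_diagonal(J, 0)
hits = drift_trial(J, contexts, items, i=0)    # → [0]
```

The caller records \$\Delta = j - i\$ for each retrieved \$j\$ across trials.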
#### Analysis

- Repeat the procedure (e.g., 100 trials)
- For each \$\Delta \in [-9, +9]\$, compute:
  - Probability of retrieval
  - 95% confidence interval

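For the 95% confidence intervals, one option is the normal-approximation interval for a binomial proportion (a sketch; 1.96 is the standard-normal 97.5th percentile, and a Wilson interval would behave better for probabilities near 0 or 1):

```python
import math

def proportion_ci(successes, trials, z=1.96):
    """Normal-approximation 95% CI for a retrieval probability, clipped to [0, 1]."""
    p = successes / trials
    half = z * math.sqrt(p * (1 - p) / trials)
    return max(0.0, p - half), min(1.0, p + half)

lo, hi = proportion_ci(successes=37, trials=100)
```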
#### Visualization

Create a line plot:

- \$x\$-axis: Relative position \$\Delta\$
- \$y\$-axis: Retrieval probability
- Error bars: 95% confidence intervals

Write a brief interpretation of the observed pattern.
