Commit a70a191
Committed on 2024-08-27
1 parent a88941c

File tree

2 files changed (+28, -1 lines)


index.html

Lines changed: 1 addition & 1 deletion

@@ -39,7 +39,7 @@ <h3>
   When?
 </h3>
 <p>
-  Last time this was edited was 2024-08-26 (YYYY/MM/DD).
+  Last time this was edited was 2024-08-27 (YYYY/MM/DD).
 </p>
 <small><a href="misc.html">misc</a></small>
 </body>

papers/list.json

Lines changed: 27 additions & 0 deletions

@@ -1,4 +1,31 @@
 [
+  {
+    "title": "Meta-Learning of Neural Architectures for Few-Shot Learning",
+    "author": "Thomas Elsken et al",
+    "year": "2021",
+    "topic": "NAS, meta-learning, few-shot, fsl",
+    "venue": "arXiv",
+    "description": "The authors propose MetaNAS, the first method to fully integrate NAS with gradient-based meta-learning: a gradient-based NAS method such as DARTS is trained jointly with meta-learning of the architecture itself. The goal is to meta-learn an architecture \\alpha_{meta} with corresponding meta-learned weights w_{meta} such that, given a new task \\mathcal{T}_{i}, both \\alpha_{meta} and w_{meta} adapt quickly to \\mathcal{T}_{i} from a few samples. One interesting technique: a temperature term annealed to 0 over the course of task training, which encourages sparsity in the mixture weights over candidate operations during the DARTS search.",
+    "link": "https://arxiv.org/pdf/1911.11090"
+  },
+  {
+    "title": "MetAdapt: Meta-Learned Task-Adaptive Architecture for Few-Shot Classification",
+    "author": "Sivan Doveh et al",
+    "year": "2020",
+    "topic": "NAS, meta-learning, few-shot, fsl",
+    "venue": "arXiv",
+    "description": "The authors propose a DARTS-like search for FSL architectures. \"Our goal is to learn a neural network where connections are controllable and adapt to the few-shot task with novel categories... However, unlike DARTS, our goal is not to learn a one time architecture to be used for all tasks... we need to make our architecture task adaptive so it would be able to quickly rewire for each new target task.\" Concretely, they design a component called the MetAdapt Controller that rewires the connections of the main network according to the given task.",
+    "link": "https://arxiv.org/pdf/1912.00412"
+  },
+  {
+    "title": "MetAdapt: Meta-Learned Task-Adaptive Architecture for Few-Shot Classification",
+    "author": "Sivan Doveh et al",
+    "year": "2020",
+    "topic": "NAS, meta-learning, few-shot, fsl",
+    "venue": "arXiv",
+    "description": "The authors propose a DARTS-like search for FSL architectures. \"Our goal is to learn a neural network where connections are controllable and adapt to the few-shot task with novel categories... However, unlike DARTS, our goal is not to learn a one time architecture to be used for all tasks... we need to make our architecture task adaptive so it would be able to quickly rewire for each new target task.\" Concretely, they design a component called the MetAdapt Controller that rewires the connections of the main network according to the given task.",
+    "link": "https://arxiv.org/pdf/1912.00412"
+  },
   {
     "title": "Distilling the Knowledge in a Neural Network",
     "author": "Geoffrey Hinton et al",

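The entries added to papers/list.json all follow one flat schema (title, author, year, topic, venue, description, link), with topics stored as one comma-separated string. A small sketch of filtering such a list by topic; in the repository the list would come from `json.load` on papers/list.json, and the `papers_by_topic` helper plus the abbreviated inline entries are my own illustration, not code from the repo:

```python
import json

def papers_by_topic(papers, keyword):
    """Return entries whose comma-separated 'topic' field contains keyword."""
    return [
        p for p in papers
        if keyword in (t.strip() for t in p["topic"].split(","))
    ]

# Inline stand-ins for json.load(open("papers/list.json")); fields abbreviated.
papers = [
    {"title": "Meta-Learning of Neural Architectures for Few-Shot Learning",
     "author": "Thomas Elsken et al", "year": "2021",
     "topic": "NAS, meta-learning, few-shot, fsl", "venue": "arXiv",
     "link": "https://arxiv.org/pdf/1911.11090"},
    {"title": "Distilling the Knowledge in a Neural Network",
     "author": "Geoffrey Hinton et al", "year": "2015",
     "topic": "distillation", "venue": "arXiv",
     "link": "https://arxiv.org/pdf/1503.02531"},
]

for p in papers_by_topic(papers, "NAS"):
    print(p["year"], p["title"])
```

Matching whole comma-separated tokens (rather than substrings) avoids, e.g., "fsl" accidentally matching inside an unrelated topic string.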