
Commit 8499168

update
1 parent 06eed29 commit 8499168

File tree

1 file changed: +18 -0 lines changed


papers/list.json

Lines changed: 18 additions & 0 deletions
@@ -1,4 +1,22 @@
 [
+  {
+    "title": "Quantization and Training of Neural Networks for Efficient Integer-Arithmetic-Only Inference",
+    "author": "Benoit Jacob et al",
+    "year": "2017",
+    "topic": "quantization, quantization schemes, efficient inference, floating-point",
+    "venue": "arxiv",
+    "description": "The authors propose a quantization scheme that uses only integer arithmetic to approximate floating-point computations in a neural network. They also describe a training approach that simulates the effect of quantization in the forward pass: backpropagation still occurs and all weights and biases are stored in floating point, but the forward pass simulates quantized inference by rounding values with the proposed float-to-integer quantization scheme.",
+    "link": "https://arxiv.org/pdf/1712.05877"
+  },
+  {
+    "title": "PACT: Parameterized Clipping Activation for Quantized Neural Networks",
+    "author": "Jungwook Choi et al",
+    "year": "2018",
+    "topic": "quantization, clipping, activations",
+    "venue": "ICLR",
+    "description": "The authors present a quantization method that clips activations with a learnable parameter, alpha. They show that this leads to smaller accuracy drops than other quantization methods, and note that activations have historically been harder to quantize than weights. They also prove that PACT is as expressive as ReLU by showing it can reach the same solution as ReLU when trained with SGD, and they describe the hardware benefits it can provide.",
+    "link": "https://arxiv.org/pdf/1805.06085"
+  },
   {
     "title": "SMASH: One-Shot Model Architecture Search through Hypernetworks",
     "author": "Andrew Brock et al",
