
Commit f2c1866

DOC: improve fact linear + add fact. embedding

2 files changed: +39 -2 lines changed
Lines changed: 20 additions & 0 deletions
@@ -0,0 +1,20 @@
Factorized embedding layers
===========================

In TensorLy-Torch, we also provide out-of-the-box tensorized embedding layers.

Just as for factorized linear layers, you can either create a factorized embedding from scratch; here, we automatically determine the
input and output tensorized shapes, to have 3 dimensions each:

.. code-block:: python

    import tltorch
    import torch

    num_embeddings, embedding_dim = 10000, 256  # example sizes: vocabulary and embedding dimension

    from_embedding = tltorch.FactorizedEmbedding(num_embeddings, embedding_dim,
                                                 auto_reshape=True, d=3, rank=0.4)


Or, you can create it by decomposing an existing embedding layer:

.. code-block:: python

    embedding_layer = torch.nn.Embedding(num_embeddings, embedding_dim)  # an existing layer to compress

    from_embedding = tltorch.FactorizedEmbedding.from_embedding(embedding_layer, auto_reshape=True,
                                                                factorization='blocktt',
                                                                n_tensorized_modes=3, rank=0.4)
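
In either case, the resulting module is meant to be used like a regular ``torch.nn.Embedding``. A minimal usage sketch:

.. code-block:: python

    indices = torch.randint(0, num_embeddings, (8,))  # a batch of 8 indices
    embeddings = from_embedding(indices)              # shape: (8, embedding_dim)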

doc/user_guide/tensorized_linear.rst

Lines changed: 19 additions & 2 deletions
@@ -24,22 +24,39 @@ Now, imagine you already have a linear layer:
    linear = torch.nn.Linear(in_features=16, out_features=10)
You can easily compress it into a tensorized linear layer. Here we specify the shape to which to tensorize the weights,
and use ``rank=0.5``, which means the rank is automatically determined so that the factorization uses approximately half
the number of parameters.

.. code-block:: python

    fact_linear = tltorch.FactorizedLinear.from_linear(linear, auto_tensorize=False,
                                                       in_tensorized_features=(4, 4),
                                                       out_tensorized_features=(2, 5), rank=0.5)
The tensorized weights will have the following shape:

.. parsed-literal::

    torch.Size([4, 4, 2, 5])
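
Since ``rank=0.5`` targets roughly half the parameter count, you can verify the compression with standard
``torch.nn.Module`` parameter counting (a minimal sketch):

.. code-block:: python

    n_params_full = sum(p.numel() for p in linear.parameters())
    n_params_fact = sum(p.numel() for p in fact_linear.parameters())
    print(n_params_fact / n_params_full)  # roughly 0.5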
Note that you can also let TensorLy-Torch automatically determine the tensorization shape. In this case, we just
instruct it to find ``in_tensorized_features`` and ``out_tensorized_features`` of length ``2``:

.. code-block:: python

    fact_linear = tltorch.FactorizedLinear.from_linear(linear, auto_tensorize=True,
                                                       n_tensorized_modes=2, rank=0.5)
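
Because the factorized layer approximates the original one, a quick sanity check is to compare the two layers on the
same input (a minimal sketch; both are called like any ``torch.nn.Linear``):

.. code-block:: python

    x = torch.randn(8, 16)  # a batch of 8 samples with 16 features
    relative_error = torch.norm(fact_linear(x) - linear(x)) / torch.norm(linear(x))
    print(relative_error)   # small but non-zero: the weights are approximated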
You can also create tensorized layers from scratch:

.. code-block:: python

    fact_linear = tltorch.FactorizedLinear(in_tensorized_features=(4, 4),
                                           out_tensorized_features=(2, 5),
                                           factorization='tucker', rank=0.5)

Finally, during the forward pass, you can either reconstruct the full weights (``implementation='reconstructed'``) and
perform a regular linear forward pass, or let TensorLy-Torch directly contract the input tensor with the
*factors of the decomposition* (``implementation='factorized'``), which can be faster, particularly if you have a very
small rank (i.e. very small factors).
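
As a sketch of choosing between the two (assuming, as the options above suggest, that ``implementation`` can be passed
to the constructor):

.. code-block:: python

    # Assumption: ``implementation`` is accepted as a constructor argument.
    fact_linear = tltorch.FactorizedLinear(in_tensorized_features=(4, 4),
                                           out_tensorized_features=(2, 5),
                                           factorization='tucker', rank=0.5,
                                           implementation='factorized')

    y = fact_linear(torch.randn(8, 16))  # forward pass contracts the input with the factors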
