
Commit 4ac1149

Improved documentation + fact. tensor user-guide
1 parent e150f77 commit 4ac1149

File tree

4 files changed (+130, -2 lines)


doc/index.rst

Lines changed: 1 addition & 0 deletions
@@ -29,6 +29,7 @@
  It comes with all batteries included and tries to make it as easy as possible to use tensor methods within your deep networks.

  - **Leverage structure in your data**: with tensor layers, you can easily leverage the structure in your data, through :ref:`TRL <trl_ref>`, :ref:`TCL <tcl_ref>`, :ref:`Factorized convolutions <factorized_conv_ref>` and more!
+ - **Factorized tensors** as first-class citizens: you can transparently create, manipulate and index factorized tensors and regular (dense) PyTorch tensors alike!
  - **Built-in tensor layers**: all you have to do is import TensorLy-Torch and include the layers we provide directly within your PyTorch models!
  - **Initialization**: initializing tensor decompositions can be tricky. We take care of it all, whether you want to initialize randomly using our :ref:`init_ref` module or from a pretrained layer.
  - **Tensor hooks**: you can easily augment your architectures with our built-in :mod:`Tensor Hooks <tltorch.tensor_hooks>`. Robustify your network with Tensor Dropout and automatically select the rank end-to-end with L1 Regularization!

doc/user_guide/factorized_tensors.rst

Lines changed: 127 additions & 0 deletions
@@ -0,0 +1,127 @@
Factorized tensors
==================

The core concept in TensorLy-Torch is that of *factorized tensors*.
We provide a :class:`~tltorch.FactorizedTensor` class that can be used just like any ``torch.Tensor``, but
gives you access to all tensor factorizations through one simple API.


Creating factorized tensors
---------------------------

You can easily create a new factorized tensor. The signature is:

.. code-block:: python

    factorized_tensor = FactorizedTensor.new(shape, rank, factorization)

For instance, to create a tensor in Tucker form that has half the parameters of a dense (non-factorized) tensor of the same shape, you would simply write:

.. code-block:: python

    tucker_tensor = FactorizedTensor.new(shape, rank=0.5, factorization='tucker')

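To make this concrete, here is a minimal sketch (the shape and initialization are illustrative, and it assumes the factorized tensor exposes a ``to_tensor()`` method for dense reconstruction, as in TensorLy):

.. code-block:: python

    import tltorch

    shape = (16, 16, 16)
    # Tucker tensor with roughly half the parameters of the dense equivalent
    tucker_tensor = tltorch.FactorizedTensor.new(shape, rank=0.5, factorization='tucker')
    tucker_tensor.normal_(mean=0, std=0.02)   # initialize the factors

    full = tucker_tensor.to_tensor()          # dense reconstruction, a regular torch.Tensor
    print(full.shape)                         # torch.Size([16, 16, 16])
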
Since TensorLy-Torch builds on top of TensorLy, it also comes with tensor decomposition out-of-the-box.
To obtain a factorized tensor in CP (Canonical-Polyadic) form, also known as Parafac or Kruskal tensor,
with 1/10th of the parameters of an existing dense tensor, you can simply write:

.. code-block:: python

    cp_tensor = FactorizedTensor.from_tensor(dense_tensor, rank=0.1, factorization='CP')


Manipulating factorized tensors
-------------------------------

The first thing you want to do, if you created a new tensor from scratch (using the ``new`` method), is to initialize it,
e.g. so that the elements of the reconstruction approximately follow a Gaussian distribution:

.. code-block:: python

    cp_tensor.normal_(mean=0, std=0.02)

You can even use PyTorch's initialization functions directly:

.. code-block:: python

    from torch.nn import init

    init.kaiming_normal_(cp_tensor)

Finally, you can index tensors directly in factorized form, which returns another factorized tensor whenever possible!

>>> cp_tensor[:2, :2]
CPTensor(shape=(2, 2, 2), rank=2)

If that is not possible, a dense tensor is returned instead:

>>> cp_tensor[2, 3, 1]
tensor(0.0250, grad_fn=<SumBackward0>)

Note how, above, indexing tracks gradients as well!

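As a quick illustration (a minimal sketch; it assumes the CP factors are regular PyTorch parameters, so gradients are accumulated on them):

.. code-block:: python

    # indexing a single entry returns a dense scalar that tracks gradients
    value = cp_tensor[2, 3, 1]

    # back-propagation flows the gradients to the underlying CP factors
    value.backward()
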
Tensorized tensors
==================

In addition to tensors in factorized form, TensorLy-Torch provides out-of-the-box support for **tensorized** tensors.
The most common case is that of tensorized matrices, where a matrix is first *tensorized*, i.e. reshaped into
a higher-order tensor, which is then decomposed and stored in factorized form.

A commonly used tensorized tensor is the tensor-train matrix (also known as a Matrix-Product Operator in quantum physics),
or, in general, Block-TT.

Creation
--------

You can create one in TensorLy-Torch, from a matrix, just as easily as a regular tensor, using the :class:`tltorch.TensorizedTensor` class,
with the following signature:

.. code-block:: python

    TensorizedTensor.from_matrix(matrix, tensorized_row_shape, tensorized_column_shape, rank)

where ``tensorized_row_shape`` and ``tensorized_column_shape`` indicate the shapes to which the rows and columns of the given matrix are tensorized.
For instance, for a matrix of size 16x21, you could use ``tensorized_row_shape=(4, 4)`` and ``tensorized_column_shape=(3, 7)``.

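For example (a minimal sketch; the random matrix and the rank value are illustrative):

.. code-block:: python

    import torch
    import tltorch

    matrix = torch.randn(16, 21)
    # tensorize the 16 rows into (4, 4) and the 21 columns into (3, 7)
    ttm = tltorch.TensorizedTensor.from_matrix(matrix, (4, 4), (3, 7), rank=0.5)
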
In general, you can tensorize any tensor, not just matrices, even with batched modes (dimensions)!

.. code-block:: python

    tensorized_tensor = TensorizedTensor.new(tensorized_shape, rank, factorization)

``tensorized_shape`` is a nested tuple, in which an int represents a batched mode and a tuple a tensorized mode.

For instance, a batch of 5 matrices of size 16x21 could be tensorized into
a batch of 5 tensorized matrices of size (4x4)x(3x7), in BlockTT form. In code, you would do this using:

.. code-block:: python

    tensorized_tensor = TensorizedTensor.from_tensor(tensor, (5, (4, 4), (3, 7)), rank=0.7, factorization='BlockTT')

You can of course tensorize tensors of any size, e.g. a batch of 5 matrices of size 8x27 can be tensorized with:

>>> ftt = tltorch.TensorizedTensor.new((5, (2, 2, 2), (3, 3, 3)), rank=0.5, factorization='BlockTT')

This returns a tensorized tensor, stored in decomposed form:

>>> ftt
BlockTT(shape=[5, 8, 27], tensorized_shape=(5, (2, 2, 2), (3, 3, 3)), rank=[1, 20, 20, 1])

Manipulation
------------

As with factorized tensors, you can directly index them:

>>> ftt[2]
BlockTT(shape=[8, 27], tensorized_shape=[(2, 2, 2), (3, 3, 3)], rank=[1, 20, 20, 1])

>>> ftt[0, :2, :2]
tensor([[-0.0009,  0.0004],
        [ 0.0007,  0.0003]], grad_fn=<SqueezeBackward0>)

Again, notice that gradients are tracked and all operations on factorized and tensorized tensors support back-propagation!
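
For instance (a minimal sketch; it assumes the BlockTT factors are exposed as regular PyTorch parameters, e.g. through ``parameters()``):

.. code-block:: python

    # take a dense slice, reduce it to a scalar and back-propagate
    loss = ftt[0, :2, :2].sum()
    loss.backward()

    # the gradients now live on the underlying factors
    for factor in ftt.parameters():
        print(factor.grad.shape)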

doc/user_guide/tensorized_linear.rst

Lines changed: 1 addition & 1 deletion
@@ -59,4 +59,4 @@ You can also create tensorized layers from scratch:
  Finally, during the forward pass, you can reconstruct the full weights (``implementation='reconstructed'``) and perform a regular linear layer forward pass.
  Alternatively, you can let TensorLy-Torch automatically contract the input tensor directly with the *factors of the decomposition* (``implementation='factorized'``),
- which can be faster, particularly if you have a very small rank, e.g. very small factorization factors.
+ which can be faster, particularly if you have a very small rank, e.g. very small factorization factors.
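
To illustrate the two options described above, here is a minimal sketch (it assumes a ``tltorch.FactorizedLinear`` layer that accepts tensorized input/output feature shapes and an ``implementation`` keyword matching these options; the names and values are illustrative):

.. code-block:: python

    import torch
    import tltorch

    # a 16 -> 64 linear layer whose weight is stored in factorized form;
    # 'factorized' contracts the input directly with the factors at each forward pass,
    # 'reconstructed' would rebuild the full weight matrix instead
    layer = tltorch.FactorizedLinear(in_tensorized_features=(4, 4),
                                     out_tensorized_features=(8, 8),
                                     rank=0.1,
                                     implementation='factorized')

    x = torch.randn(2, 16)
    y = layer(x)   # shape (2, 64)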

tltorch/factorized_tensors/core.py

Lines changed: 1 addition & 1 deletion
@@ -545,7 +545,7 @@ def init_from_matrix(self, matrix, **kwargs):
          return self.init_from_tensor(tensor, **kwargs)

      def __repr__(self):
-         msg = f'{self.__class__.__name__}, shape={self.shape}, tensorized_shape={self.tensorized_shape}, '
+         msg = f'{self.__class__.__name__}(shape={self.shape}, tensorized_shape={self.tensorized_shape}, '
          msg += f'rank={self.rank})'
          return msg
