Commit a7f0779

Merge pull request #13 from lucidrains/vector-gating
allow one to set their own path to cache directory with environmental…
2 parents 621aeaa + 829e7b2 commit a7f0779

3 files changed, +10 -3 lines changed


README.md

Lines changed: 6 additions & 0 deletions
@@ -520,6 +520,12 @@ Or you can try deleting the cache directory, which should exist at
 $ rm -rf ~/.cache.equivariant_attention
 ```
 
+You can also designate your own directory for storing the caches, in case the default directory has permission issues.
+
+```bash
+CACHE_PATH=./path/to/my/cache python train.py
+```
+
 ## Testing
 
 ```bash
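
Below is a small, hypothetical sketch (not part of this commit) showing the same override applied from Python instead of the shell. Because `basis.py` reads `CACHE_PATH` at import time, the variable has to be set before the package is imported; the `SE3Transformer` import is assumed from the README's other usage examples.

```python
import os

# Sketch: point the library at a writable cache directory before importing it,
# for cases where the default ~/.cache.equivariant_attention is not writable.
# CACHE_PATH is read once, at import time, in se3_transformer_pytorch/basis.py.
os.environ['CACHE_PATH'] = './path/to/my/cache'

# assumed public entry point, as used in the README's usage examples
from se3_transformer_pytorch import SE3Transformer
```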

se3_transformer_pytorch/basis.py

Lines changed: 3 additions & 2 deletions
@@ -7,12 +7,13 @@
 from contextlib import contextmanager
 
 from se3_transformer_pytorch.irr_repr import irr_repr, spherical_harmonics
-from se3_transformer_pytorch.utils import torch_default_dtype, cache_dir, exists, to_order
+from se3_transformer_pytorch.utils import torch_default_dtype, cache_dir, exists, default, to_order
 from se3_transformer_pytorch.spherical_harmonics import clear_spherical_harmonics_cache
 
 # constants
 
-CACHE_PATH = os.path.expanduser('~/.cache.equivariant_attention') if not exists(os.environ.get('CLEAR_CACHE')) else None
+CACHE_PATH = default(os.getenv('CACHE_PATH'), os.path.expanduser('~/.cache.equivariant_attention'))
+CACHE_PATH = CACHE_PATH if not exists(os.environ.get('CLEAR_CACHE')) else None
 
 # todo (figure ot why this was hard coded in official repo)
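
For clarity, here is a minimal standalone sketch of the resolution order the two new lines implement, assuming the `exists` and `default` helpers imported from `se3_transformer_pytorch/utils.py` are plain None-checks (their definitions are not shown in this diff):

```python
import os

# assumed behaviour of the helpers imported from se3_transformer_pytorch.utils
def exists(val):
    return val is not None

def default(val, d):
    return val if exists(val) else d

# 1. an explicit CACHE_PATH environment variable wins,
#    otherwise fall back to the home-directory default
cache_path = default(os.getenv('CACHE_PATH'), os.path.expanduser('~/.cache.equivariant_attention'))

# 2. setting CLEAR_CACHE (to any value) still disables on-disk caching entirely
cache_path = cache_path if not exists(os.environ.get('CLEAR_CACHE')) else None

print(cache_path)
```

Note that CLEAR_CACHE keeps precedence over a user-supplied CACHE_PATH, just as it did over the old hard-coded default.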

setup.py

Lines changed: 1 addition & 1 deletion
@@ -4,7 +4,7 @@
 name = 'se3-transformer-pytorch',
 packages = find_packages(),
 include_package_data = True,
-version = '0.8.7',
+version = '0.8.8',
 license='MIT',
 description = 'SE3 Transformer - Pytorch',
 author = 'Phil Wang',
