
Commit 831fc3b

Merge pull request #44 from OptimalFoundation/development
[Patch] Minor (embarrassing) import issue + remove setup.py/cfg + update deps
2 parents e27c12e + cb6b7aa commit 831fc3b

File tree

7 files changed (+48, -57 lines)


.gitignore

Lines changed: 3 additions & 1 deletion
@@ -13,4 +13,6 @@ src/nadir.egg-info/*
 
 nadir.egg-info/*
 
-tests/data/*
+tests/data/*
+
+build/*

README.md

Lines changed: 20 additions & 7 deletions
@@ -8,21 +8,22 @@
 
 **Nadir** (pronounced _nay-di-ah_) is derived from the Arabic word _nazir_, and means "the lowest point of a space". In optimisation problems, it is equivalent to the point of minimum. If you are a machine learning enthusiast, a data scientist or an AI practitioner, you know how important it is to use the best optimization algorithms to train your models. The purpose of this library is to help optimize machine learning models and enable them to reach the point of nadir in the appropriate context.
 
-PyTorch is a popular machine learning framework that provides a flexible and efficient way of building and training deep neural networks. This library, Nadir, is built on top of PyTorch to provide high-performing general-purpose optimisation algorithms.
+**Nadir** follows the principles of Simplicity, Modularity and Composability. Read more in the [Core Philosophy](#core-philosophy) section.
 
-# Table of Contents
+## Table of Contents
 
 - [Nadir](#nadir)
 - [Table of Contents](#table-of-contents)
 - [Installation](#installation)
 - [Simple Usage](#simple-usage)
+- [Core Philosophy](#core-philosophy)
 - [Supported Optimisers](#supported-optimisers)
 - [Acknowledgements](#acknowledgements)
 - [Citation](#citation)
 
 
 
-# Installation
+## Installation
 
 You can either install from the PyPI index, in the following manner:
@@ -36,7 +37,7 @@ $ pip install git+https://github.com/Dawn-Of-Eve/nadir.git
 ```
 **Note:** Installing from source might result in a broken package. It is recommended that you install from PyPI instead.
 
-# Simple Usage
+## Simple Usage
 
 ```python
 import nadir as nd
@@ -52,7 +53,19 @@ optimizer = nd.SGD(model.parameters(), config)
 optimizer.step()
 ```
 
-# Supported Optimisers
+## Core Philosophy
+
+`Nadir` was built to bring a sense of uniformity and integration that might be lacking in the optimisation community, based on the simple idea that optimisers are not islands. They usually inherit characteristics from other optimisers and, in turn, inspire new ones. So why not make optimisers inheritable, composable and modular objects?
+
+The core concepts that each optimiser in `Nadir` follows are:
+
+1. **Simplicity** is of key importance. We prefer readability and simplicity over performance. Experiment, test and verify what works and what does not with Nadir; afterwards, write optimised custom fused kernels for your favourite optimisers where performance matters.
+
+2. **Modularity** means that each new optimiser should introduce as little new logic as possible, adding or editing only the parts that actually change. If you want a different momentum in Adam, you inherit from Adam and override only the momentum function; there is no need to rewrite the optimiser from scratch.
+
+3. **Composability** implies that we can take pieces of one optimiser and add them to another without much effort. You can build an optimiser that mixes RAdam and NAdam with the properties of AdaBelief, if you so desire! That is what makes this library really powerful.
+
+## Supported Optimisers
 
 | Optimiser | Paper |
 |:---------: |:-----: |
@@ -71,12 +84,12 @@ optimizer.step()
 | **AdaBelief** | https://arxiv.org/pdf/2010.07468v5.pdf |
 | **NAdam** | http://cs229.stanford.edu/proj2015/054_report.pdf |
 
-# Acknowledgements
+## Acknowledgements
 
 We would like to thank all the amazing contributors of this project who spent so much effort making this repository awesome! :heart:
 
 
-# Citation
+## Citation
 
 You can use the _Cite this repository_ button provided by GitHub or use the following BibTeX:

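The Modularity and Composability points in the new Core Philosophy section describe exactly the inheritance pattern this commit uses in `src/nadir/radam.py`. A minimal, hypothetical sketch of that pattern is shown below; it assumes `Adam` and `AdamConfig` are importable from `nadir.adam` (as the radam.py diff suggests) and that a derived optimiser is used like the ones in the Simple Usage snippet. The names `MyAdamConfig`/`MyAdam` and the `weight_decay` value are illustrative, not part of the library.

```python
# Hypothetical sketch of "inherit and override only what changes", mirroring
# the pattern of src/nadir/radam.py in this commit. Names are made up.
import torch
from nadir.adam import Adam, AdamConfig   # assumed import path, as used in radam.py

class MyAdamConfig(AdamConfig):
    weight_decay : float = 1e-2   # the only piece we change

class MyAdam(Adam):
    # Momentum, adaptivity and the update step are all inherited from Adam.
    def __init__(self, params, config: MyAdamConfig = MyAdamConfig()):
        super().__init__(params, config)
        self.config = config

# Used like the optimisers in the README's Simple Usage snippet:
model = torch.nn.Linear(4, 2)
optimizer = MyAdam(model.parameters())
model(torch.randn(8, 4)).sum().backward()
optimizer.step()
```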
ROADMAP.md

Lines changed: 11 additions & 2 deletions
@@ -1,5 +1,14 @@
+### 0.2.0
+- [ ] Readily usable micro-test for checking on optimizers
+- [ ] Make Config API optional
+- [ ] Add unit tests for the build
+
+
+
 ### 0.1.0
 
 - [x] Readily usable testing script on MNIST
-- [ ] Implementation of SGD, Adam in PyPI module of Nadir
-- [ ] Have BaseOptimiser class, BaseMomentumOptimiser and BaseAdaptiveOptimiser class
+- [x] Implementation of SGD, Adam in PyPI module of Nadir
+- [x] Have BaseOptimiser class, BaseMomentumOptimiser and BaseAdaptiveOptimiser class
+
+

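A minimal sketch of what the 0.2.0 item "Readily usable micro-test for checking on optimizers" might look like is given below. It is purely illustrative: it assumes `nd.SGD` can be constructed with a default config (as `Radam` can after this commit) and that Nadir optimisers expose the standard `zero_grad()`/`step()` interface of `torch.optim` optimisers.

```python
# Hypothetical micro-test sketch for the 0.2.0 roadmap item: check that a
# Nadir optimiser actually reduces the loss on a tiny regression problem.
import torch
import nadir as nd

def test_optimiser_reduces_loss():
    torch.manual_seed(0)
    model = torch.nn.Linear(2, 1)
    data, target = torch.randn(64, 2), torch.randn(64, 1)
    optimizer = nd.SGD(model.parameters())   # assumes a default config exists

    def loss():
        return torch.nn.functional.mse_loss(model(data), target)

    initial = loss().item()
    for _ in range(200):
        optimizer.zero_grad()
        loss().backward()
        optimizer.step()

    assert loss().item() < initial   # the optimiser should make progress
```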
pyproject.toml

Lines changed: 12 additions & 6 deletions
@@ -4,14 +4,13 @@ build-backend = "setuptools.build_meta"
 
 [project]
 name = "nadir"
-version = "0.1.1"
 authors = [
     { name="Bhavnick Minhas", email="bhavnicksm@gmail.com" },
 ]
 maintainers = [
     { name = "Bhavnick Minhas", email="bhavnicksm@gmail.com"},
 ]
-description = "Nadir is a library of bleeding-edge DL optimisers built for speed and functionality in PyTorch for researchers"
+description = "Nadir: Cutting-edge PyTorch optimizers for simplicity & composability! 🔥🚀💻"
 readme = "README.md"
 requires-python = ">=3.7"
 classifiers = [
@@ -23,16 +22,23 @@ classifiers = [
 license = { text = "Apache 2.0"}
 dependencies = [
     "torch>=1.13.1",
+]
+dynamic = ["version"]
+
+[project.optional-dependencies]
+test = [
     "torchvision>=0.14.1",
     "tqdm",
     "wandb"
 ]
 
 [project.urls]
-"Homepage" = "https://github.com/Dawn-Of-Eve/nadir"
-"Bug Tracker" = "https://github.com/Dawn-Of-Eve/nadir/issues"
-
+"Homepage" = "https://github.com/OptimalFoundation/nadir"
+"Bug Tracker" = "https://github.com/OptimalFoundation/nadir/issues"
 
 
 [tools.setuptools]
 packages = ["nadir"]
-package-dir = {"" = "src"}
+package-dir = {"" = "src"}
+
+[tool.setuptools.dynamic]
+version = { attr = "nadir.__version__" }

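The new `dynamic = ["version"]` entry, together with the `[tool.setuptools.dynamic]` table, makes setuptools read the package version from `nadir.__version__` at build time instead of the hard-coded `version = "0.1.1"` removed above. A minimal sketch of what that implies for `src/nadir/__init__.py` (the exact value is assumed, not taken from the repository):

```python
# src/nadir/__init__.py (sketch): setuptools resolves the build version from
# this attribute via version = { attr = "nadir.__version__" }.
__version__ = "0.1.1"   # hypothetical value
```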
setup.cfg

Lines changed: 0 additions & 34 deletions
This file was deleted.

setup.py

Lines changed: 0 additions & 3 deletions
This file was deleted.

src/nadir/radam.py

Lines changed: 2 additions & 4 deletions
@@ -17,9 +17,7 @@
 import torch
 import math
 
-from .base import BaseOptimizer
-from .base import BaseConfig
-
+from .adam import Adam, AdamConfig
 
 __all__ = ['RadamConfig', 'Radam']
 
@@ -32,7 +30,7 @@ class RadamConfig(AdamConfig):
     weight_decay : float = 0.
 
 class Radam(Adam):
-    def __init__ (self, params, config : LionConfig = LionConfig()):
+    def __init__ (self, params, config : AdamConfig = AdamConfig()):
         super().__init__(params, config)
         self.config = config

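The titular import fix replaces the unused `BaseOptimizer`/`BaseConfig` imports with the `Adam` and `AdamConfig` names that `Radam` actually references, and corrects the stray `LionConfig` default in the constructor signature. A quick smoke check along the lines below would have caught it; this is a hypothetical sketch that assumes `Radam` is re-exported at the top level of the `nadir` package, the way `SGD` is used in the README snippet.

```python
# Hypothetical smoke check for the radam.py fix: before this commit, importing
# the module failed with a NameError because AdamConfig was never imported
# (and the constructor default referenced an undefined LionConfig).
import torch
import nadir as nd   # assumes Radam is re-exported at the package top level

model = torch.nn.Linear(4, 2)
optimizer = nd.Radam(model.parameters())   # default AdamConfig() per the fixed signature

model(torch.randn(1, 4)).sum().backward()
optimizer.step()
```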