Convert activation functions to numpower #381

Open

wants to merge 115 commits into base: 3.0

Commits (115)
4f22784
first commit
SkibidiProduction Mar 25, 2025
392e5ab
Basic implementation
SkibidiProduction Mar 26, 2025
6a152f2
intermediate commit
SkibidiProduction Mar 28, 2025
9d8c5ce
Merge branch 'refs/heads/3.0' into convert-activation-functions-to-nu…
SkibidiProduction Apr 25, 2025
394c2a6
base changes
SkibidiProduction Jun 13, 2025
479f851
base changes
SkibidiProduction Jun 13, 2025
e5151a5
Renamed inherited final method
apphp Jun 15, 2025
6134448
Merge pull request #366 from apphp/SAM-1-fix-unit-tests-issues
SkibidiProduction Jun 15, 2025
c528e87
Refactored ELU class
apphp Jun 17, 2025
ac32b87
Added docker related files
apphp Jun 17, 2025
0438747
Fixed error on implements base class
apphp Jun 17, 2025
af4f899
Typo fix in class name
apphp Jun 17, 2025
f44a597
Refactored ELUTest
apphp Jun 17, 2025
7c7c999
Style fixes for ELU and ELUTest
apphp Jun 17, 2025
d405dbd
Style fixes for ELU and ELUTest
apphp Jun 17, 2025
f3371fe
Syntax fixes for ELUTest
apphp Jun 17, 2025
fab36f4
Syntax fixes for PHP8.4
apphp Jun 17, 2025
9cfdba6
Improved configuration for PHP-CS-Fixer
apphp Jun 17, 2025
1e6b857
Added PHP linter
apphp Jun 17, 2025
797da3d
Improved configuration for PHP-CS-Fixer
apphp Jun 17, 2025
3bb5166
Improved configuration for PHP-CS-Fixer
apphp Jun 17, 2025
cb4aa27
Improved configuration for phpstan
apphp Jun 17, 2025
5a15240
Merge pull request #368 from apphp/SAM-3-add-phpinter-and-improve-stan
SkibidiProduction Jun 17, 2025
e95ed9c
Initial implementation of GeLU with test
apphp Jun 17, 2025
58b0ca5
Added suppress for NDArray errors in ActivationFunctions
apphp Jun 17, 2025
764647b
Added suppress for NDArray errors in ActivationFunctions
apphp Jun 17, 2025
344a765
Removed named arguments from code
apphp Jun 18, 2025
0caa2f2
Simplifyed
apphp Jun 18, 2025
12ceaa4
Revert "Simplifyed"
apphp Jun 18, 2025
c4fc62e
Completed test for GeLU
apphp Jun 18, 2025
9271ecb
Renamed interface name
apphp Jun 18, 2025
e1314b0
Changed style for tests according to phpunit v12
apphp Jun 18, 2025
3d113a5
Removed comment
apphp Jun 18, 2025
0dc6bcf
Merge pull request #369 from apphp/SAM-4-implementation-of-gelu-function
SkibidiProduction Jun 18, 2025
13c9c40
Added hard sigmoid function
apphp Jun 18, 2025
1120aee
Removed charts
apphp Jun 18, 2025
0b80dc2
Added images for activation functions in docs
apphp Jun 19, 2025
9e20e55
Extended info for Leaky ReLU
apphp Jun 19, 2025
7851ffd
Extended info for ReLU
apphp Jun 19, 2025
977a606
Added doc for ReLU6
apphp Jun 19, 2025
d5976d1
Extended info for SELU
apphp Jun 19, 2025
62ea359
Extended info for Sigmoid
apphp Jun 19, 2025
5272f21
Extended info for activation function docs
apphp Jun 19, 2025
ac5a359
Extended info for SiLU
apphp Jun 19, 2025
f517914
Extended info for Softmax
apphp Jun 19, 2025
0a93aa1
Extended info for Softsign
apphp Jun 19, 2025
6ce5406
Extended info for Softplus
apphp Jun 19, 2025
31f646b
Extended info for thresholded relu
apphp Jun 19, 2025
90e4495
Typo fixes
apphp Jun 19, 2025
18817d2
Typo fixes
apphp Jun 19, 2025
4b796ce
Typo fixes
apphp Jun 19, 2025
81ccd8c
Merge pull request #370 from apphp/SAM-4-refactor-hard-sigmoid-function
SkibidiProduction Jun 19, 2025
0a30675
Merge pull request #371 from apphp/SAM-5-add-charts-for-activation-fu…
SkibidiProduction Jun 19, 2025
86be3a7
Fixes for Latex formulas
apphp Jun 19, 2025
1b3d739
Merge pull request #372 from apphp/SAM-5-add-charts-for-activation-fu…
SkibidiProduction Jun 20, 2025
06bd4fc
Implemented HardSiLU
apphp Jun 20, 2025
fcfc9e4
Typo fix for GELU
apphp Jun 20, 2025
0052750
Code style fixes
apphp Jun 20, 2025
4b438ac
Ignored errors moved to phpstan-baseline.neon
apphp Jun 20, 2025
56ca6e3
Fixed pow() precision for GELU
apphp Jun 20, 2025
2a5012b
Replace assertEqual with assertEqualsWithDelta
apphp Jun 20, 2025
619b7d0
Fixed pow() precision for GELU
apphp Jun 20, 2025
854433a
Merge pull request #373 from apphp/SAM-6-hard-silu
SkibidiProduction Jun 20, 2025
451dd2a
SAM-7 added HyperbolicTangent
apphp Jun 22, 2025
868830b
SAM-7 fix for function docblock
apphp Jun 22, 2025
43fb507
Fix for merge issues
apphp Jun 22, 2025
e16b4a4
SAM-8 implemented LeakyReLU
apphp Jun 22, 2025
c3f6530
SAM-8 style fixes for import classes
apphp Jun 22, 2025
287ec5f
SAM-8 changed LeakyReLUTest initial values
apphp Jun 22, 2025
eef5c57
SAM-8 typo fixes
apphp Jun 22, 2025
cd980ea
SAM-8 typo fixes
apphp Jun 22, 2025
d80916e
SAM-7 turned back function signature in docblock
apphp Jun 23, 2025
0025293
SAM-8 turned back function signature in docblock
apphp Jun 23, 2025
45361aa
SAM-8 fixed parameter name like in interface
apphp Jun 23, 2025
31a6e10
SAM-7 fixed parameter name like in interface
apphp Jun 23, 2025
2872f1f
Merge pull request #374 from apphp/SAM-7-HyperbolicTangent
SkibidiProduction Jun 23, 2025
eb864c2
Renamed parameter name like in interface
apphp Jun 23, 2025
5203c0a
SAM-7 differentiate parameter renamed
apphp Jun 23, 2025
ed71ea5
Refactoring Derivative sub-interfaces
apphp Jun 24, 2025
b0ce0ff
Refactoring ELU with IOBufferDerivative
apphp Jun 24, 2025
c84319e
Refactoring GELU with IBufferDerivative
apphp Jun 24, 2025
de4ef87
Refactoring HardSigmoid with IBufferDerivative
apphp Jun 24, 2025
6dd013f
Refactoring HardSigmoid with IBufferDerivative
apphp Jun 24, 2025
6020364
Refactoring HardSigmoid with IBufferDerivative
apphp Jun 24, 2025
e278894
Refactoring HardSiLU with IBufferDerivative
apphp Jun 25, 2025
87cb3b9
Refactoring HyperbolicTangent with IBufferDerivative
apphp Jun 25, 2025
edb02d0
Refactoring LeakyReLU with IBufferDerivative
apphp Jun 25, 2025
6c4d9f4
Typo fix in exception text
apphp Jun 25, 2025
39cb8a0
Merge pull request #375 from apphp/SAM-8-
SkibidiProduction Jun 29, 2025
fac54d0
Refactoring ReLU with IBufferDerivative
apphp Jun 29, 2025
2c2af8a
Style fixes
apphp Jun 29, 2025
6239d26
Refactoring ReLU6 with IBufferDerivative
apphp Jun 29, 2025
fc7f11e
Style fixes
apphp Jun 29, 2025
be0b9ff
Merge pull request #376 from apphp/SAM-9-ReLU-and-ReLU6
SkibidiProduction Jun 29, 2025
2bbfd3c
References fix
apphp Jun 30, 2025
5c2adf7
Refactoring SELU with IBufferDerivative
apphp Jun 30, 2025
7b0479a
Refactoring Sigmoid with OBufferDerivative
apphp Jun 30, 2025
9119f26
Merge pull request #377 from apphp/SAM-10-SELU
SkibidiProduction Jul 1, 2025
62f7af1
Init Refactoring SiLU with IBufferDerivative
apphp Jul 2, 2025
7b220c6
Refactoring SiLU with IBufferDerivative
apphp Jul 2, 2025
d05d9bf
code style fixes
apphp Jul 2, 2025
a8c311b
Refactoring multiply function
apphp Jul 2, 2025
c777bc0
Code optimization
apphp Jul 2, 2025
512719f
Code optimization
apphp Jul 2, 2025
5a1d2c4
Merge pull request #378 from apphp/SAM-11-SiLU
SkibidiProduction Jul 5, 2025
ed7610c
Replaced plots for Softmax function
apphp Jul 6, 2025
e02bfe5
Refactoring Softmax with OBufferDerivative
apphp Jul 6, 2025
dda3c71
Added more examples for Softmax
apphp Jul 7, 2025
4dd363e
Code cleanup for ActivationFunction
apphp Jul 7, 2025
32087ae
Update documentation for activation functions
apphp Jul 7, 2025
99a4ccd
Refactoring Softplus with IBufferDerivative
apphp Jul 7, 2025
8992e18
Merge pull request #379 from apphp/Sam-12-softmax-function
SkibidiProduction Jul 8, 2025
e6a6143
Refactoring Softsign with IBufferDerivative
apphp Jul 13, 2025
1cbad5c
Refactoring ThresholdedReLU with IBufferDerivative
apphp Jul 13, 2025
9e4a6b2
Merge pull request #380 from apphp/SAM-13-softsign
SkibidiProduction Jul 14, 2025
6 changes: 0 additions & 6 deletions .github/workflows/ci.yml
Original file line number Diff line number Diff line change
@@ -8,14 +8,8 @@ jobs:
runs-on: ${{ matrix.operating-system }}
strategy:
matrix:
<<<<<<< HEAD
operating-system: [windows-latest, ubuntu-latest, macos-latest]
php-versions: ['8.4']
=======
operating-system: [ubuntu-latest, macos-latest]
php-versions: ['8.0', '8.1', '8.2']
>>>>>>> master

steps:
- name: Checkout
uses: actions/checkout@v3
9 changes: 9 additions & 0 deletions .gitignore
@@ -13,3 +13,12 @@ pyvenv.cfg
.idea
.vscode
.vs

# Cache files
/runtime/

# Docker related files
Dockerfile
/docker/
docker-compose.yml
Makefile
231 changes: 122 additions & 109 deletions .php-cs-fixer.dist.php
@@ -3,115 +3,128 @@
use PhpCsFixer\Finder;
use PhpCsFixer\Config;

$finder = Finder::create()->in(__DIR__)
->exclude('docs');
$finder = Finder::create()
->exclude([
__DIR__ . '/docs/',
__DIR__ . '/vendor/',
])
->in(__DIR__)
->append([
__FILE__,
]);

$config = new Config();
$config
->setCacheFile(__DIR__ . '/runtime/.php-cs-fixer.cache')
->setRules(
[
'@PSR2' => true,
'@PHP84Migration' => true,
'align_multiline_comment' => true,
'array_syntax' => ['syntax' => 'short'],
'backtick_to_shell_exec' => true,
'binary_operator_spaces' => true,
'blank_lines_before_namespace' => true,
'blank_line_after_namespace' => true,
'blank_line_after_opening_tag' => true,
'blank_line_before_statement' => [
'statements' => [
'break', 'case', 'continue', 'declare', 'default', 'do', 'for',
'if', 'foreach', 'return', 'switch', 'try', 'while',
],
],
'cast_spaces' => ['space' => 'single'],
'class_attributes_separation' => true,
'combine_consecutive_issets' => true,
'combine_consecutive_unsets' => true,
'compact_nullable_type_declaration' => true,
'concat_space' => ['spacing' => 'one'],
'fully_qualified_strict_types' => true,
'increment_style' => ['style' => 'pre'],
'linebreak_after_opening_tag' => true,
'list_syntax' => ['syntax' => 'short'],
'lowercase_cast' => true,
'lowercase_static_reference' => true,
'magic_constant_casing' => true,
'magic_method_casing' => true,
'multiline_comment_opening_closing' => true,
'multiline_whitespace_before_semicolons' => [
'strategy' => 'no_multi_line',
],
'native_function_casing' => true,
'native_type_declaration_casing' => true,
'new_with_parentheses' => true,
'no_alternative_syntax' => true,
'no_blank_lines_after_class_opening' => true,
'no_blank_lines_after_phpdoc' => true,
'no_empty_statement' => true,
'no_extra_blank_lines' => true,
'no_leading_import_slash' => true,
'no_leading_namespace_whitespace' => true,
'no_mixed_echo_print' => ['use' => 'echo'],
'no_null_property_initialization' => true,
'no_short_bool_cast' => true,
'no_singleline_whitespace_before_semicolons' => true,
'no_spaces_around_offset' => true,
'no_superfluous_phpdoc_tags' => false,
'no_superfluous_elseif' => true,
'no_trailing_comma_in_singleline' => true,
'no_unneeded_control_parentheses' => true,
'no_unneeded_braces' => true,
'no_unset_cast' => true,
'no_unused_imports' => true,
'no_useless_else' => true,
'no_useless_return' => true,
'no_whitespace_before_comma_in_array' => true,
'no_whitespace_in_blank_line' => true,
'normalize_index_brace' => true,
'nullable_type_declaration_for_default_null_value' => true,
'object_operator_without_whitespace' => true,
'ordered_class_elements' => [
'order' => [
'use_trait', 'constant_public', 'constant_protected',
'constant_private', 'property_public_static', 'property_protected_static',
'property_private_static', 'property_public', 'property_protected',
'property_private', 'method_public_static', 'method_protected_static',
'method_private_static', 'construct', 'destruct', 'phpunit',
'method_public', 'method_protected', 'method_private', 'magic',
],
'sort_algorithm' => 'none',
],
'php_unit_fqcn_annotation' => true,
'php_unit_method_casing' => ['case' => 'camel_case'],
'phpdoc_add_missing_param_annotation' => ['only_untyped' => false],
'phpdoc_align' => ['align' => 'left'],
'phpdoc_line_span' => [
'const' => 'multi',
'method' => 'multi',
'property' => 'multi',
],
'phpdoc_no_access' => true,
'phpdoc_no_empty_return' => true,
'phpdoc_no_useless_inheritdoc' => true,
'phpdoc_order' => true,
'phpdoc_scalar' => true,
'phpdoc_single_line_var_spacing' => true,
'phpdoc_to_comment' => false,
'phpdoc_trim' => true,
'phpdoc_trim_consecutive_blank_line_separation' => true,
'phpdoc_var_without_name' => true,
'protected_to_private' => true,
'return_assignment' => false,
'return_type_declaration' => ['space_before' => 'one'],
'semicolon_after_instruction' => true,
'short_scalar_cast' => true,
'simplified_null_return' => true,
'single_quote' => true,
'single_line_comment_style' => true,
'ternary_operator_spaces' => true,
'ternary_to_null_coalescing' => true,
'type_declaration_spaces' => true,
'trim_array_spaces' => true,
'unary_operator_spaces' => true,
'whitespace_after_comma_in_array' => true,
]
)->setFinder($finder);

return $config->setRules([
'@PSR2' => true,
'align_multiline_comment' => true,
'array_syntax' => ['syntax' => 'short'],
'backtick_to_shell_exec' => true,
'binary_operator_spaces' => true,
'blank_lines_before_namespace' => true,
'blank_line_after_namespace' => true,
'blank_line_after_opening_tag' => true,
'blank_line_before_statement' => [
'statements' => [
'break', 'case', 'continue', 'declare', 'default', 'do', 'for',
'if', 'foreach', 'return', 'switch', 'try', 'while',
],
],
'cast_spaces' => ['space' => 'single'],
'class_attributes_separation' => true,
'combine_consecutive_issets' => true,
'combine_consecutive_unsets' => true,
'compact_nullable_type_declaration' => true,
'concat_space' => ['spacing' => 'one'],
'fully_qualified_strict_types' => true,
'increment_style' => ['style' => 'pre'],
'linebreak_after_opening_tag' => true,
'list_syntax' => ['syntax' => 'short'],
'lowercase_cast' => true,
'lowercase_static_reference' => true,
'magic_constant_casing' => true,
'magic_method_casing' => true,
'multiline_comment_opening_closing' => true,
'multiline_whitespace_before_semicolons' => [
'strategy' => 'no_multi_line',
],
'native_function_casing' => true,
'native_type_declaration_casing' => true,
'new_with_parentheses' => true,
'no_alternative_syntax' => true,
'no_blank_lines_after_class_opening' => true,
'no_blank_lines_after_phpdoc' => true,
'no_empty_statement' => true,
'no_extra_blank_lines' => true,
'no_leading_import_slash' => true,
'no_leading_namespace_whitespace' => true,
'no_mixed_echo_print' => ['use' => 'echo'],
'no_null_property_initialization' => true,
'no_short_bool_cast' => true,
'no_singleline_whitespace_before_semicolons' => true,
'no_spaces_around_offset' => true,
'no_superfluous_phpdoc_tags' => false,
'no_superfluous_elseif' => true,
'no_trailing_comma_in_singleline' => true,
'no_unneeded_control_parentheses' => true,
'no_unneeded_braces' => true,
'no_unset_cast' => true,
'no_unused_imports' => true,
'no_useless_else' => true,
'no_useless_return' => true,
'no_whitespace_before_comma_in_array' => true,
'no_whitespace_in_blank_line' => true,
'normalize_index_brace' => true,
'nullable_type_declaration_for_default_null_value' => true,
'object_operator_without_whitespace' => true,
'ordered_class_elements' => [
'order' => [
'use_trait', 'constant_public', 'constant_protected',
'constant_private', 'property_public_static', 'property_protected_static',
'property_private_static', 'property_public', 'property_protected',
'property_private', 'method_public_static', 'method_protected_static',
'method_private_static', 'construct', 'destruct', 'phpunit',
'method_public', 'method_protected', 'method_private', 'magic',
],
'sort_algorithm' => 'none',
],
'php_unit_fqcn_annotation' => true,
'php_unit_method_casing' => ['case' => 'camel_case'],
'phpdoc_add_missing_param_annotation' => ['only_untyped' => false],
'phpdoc_align' => ['align' => 'left'],
'phpdoc_line_span' => [
'const' => 'multi',
'method' => 'multi',
'property' => 'multi',
],
'phpdoc_no_access' => true,
'phpdoc_no_empty_return' => true,
'phpdoc_no_useless_inheritdoc' => true,
'phpdoc_order' => true,
'phpdoc_scalar' => true,
'phpdoc_single_line_var_spacing' => true,
'phpdoc_to_comment' => false,
'phpdoc_trim' => true,
'phpdoc_trim_consecutive_blank_line_separation' => true,
'phpdoc_var_without_name' => true,
'protected_to_private' => true,
'return_assignment' => false,
'return_type_declaration' => ['space_before' => 'one'],
'semicolon_after_instruction' => true,
'short_scalar_cast' => true,
'simplified_null_return' => true,
'single_quote' => true,
'single_line_comment_style' => true,
'ternary_operator_spaces' => true,
'ternary_to_null_coalescing' => true,
'type_declaration_spaces' => true,
'trim_array_spaces' => true,
'unary_operator_spaces' => true,
'whitespace_after_comma_in_array' => true,
])->setFinder($finder);
return $config;
8 changes: 8 additions & 0 deletions .phplint.yml
@@ -0,0 +1,8 @@
path: ./
jobs: 10
cache-dir: runtime/.phplint.cache/
extensions:
- php
exclude:
- vendor/
- runtime/
2 changes: 2 additions & 0 deletions composer.json
@@ -44,6 +44,7 @@
"require-dev": {
"friendsofphp/php-cs-fixer": "^3.73",
"phpbench/phpbench": "^1.0",
"overtrue/phplint": "^9.6.2",
"phpstan/extension-installer": "^1.0",
"phpstan/phpstan": "^2.0",
"phpstan/phpstan-phpunit": "^2.0",
@@ -89,6 +90,7 @@
"@putenv PHP_CS_FIXER_IGNORE_ENV=1",
"php-cs-fixer fix --config=.php-cs-fixer.dist.php"
],
"phplint": "phplint",
"test": "phpunit"
},
"config": {
Binary file added docs/images/activation-functions/elu.png
Binary file added docs/images/activation-functions/gelu.png
Binary file added docs/images/activation-functions/hard-sigmoid.png
Binary file added docs/images/activation-functions/hard-silu.png
Binary file added docs/images/activation-functions/relu.png
Binary file added docs/images/activation-functions/relu6.png
Binary file added docs/images/activation-functions/selu.png
Binary file added docs/images/activation-functions/sigmoid.png
Binary file added docs/images/activation-functions/silu.png
Binary file added docs/images/activation-functions/softmax.png
Binary file added docs/images/activation-functions/softplus.png
Binary file added docs/images/activation-functions/softsign.png
20 changes: 16 additions & 4 deletions docs/neural-network/activation-functions/elu.md
@@ -1,23 +1,35 @@
<span style="float:right;"><a href="https://github.com/RubixML/ML/blob/master/src/NeuralNet/ActivationFunctions/ELU.php">[source]</a></span>
<span style="float:right;"><a href="https://github.com/RubixML/ML/blob/master/src/NeuralNet/ActivationFunctions/ELU/ELU.php">[source]</a></span>

# ELU
*Exponential Linear Units* are a type of rectifier that soften the transition from non-activated to activated using the exponential function. As such, ELU produces smoother gradients than the piecewise linear [ReLU](relu.md) function.

$$
{\displaystyle ELU = {\begin{cases}\alpha \left(e^{x}-1\right)&{\text{if }}x\leq 0\\x&{\text{if }}x>0\end{cases}}}
\text{ELU}(x) =
\begin{cases}
\alpha \left(e^{x}-1\right) & \text{if } x \leq 0 \\
x & \text{if } x > 0
\end{cases}
$$

## Parameters
| # | Name | Default | Type | Description |
|---|---|---|---|---|
| 1 | alpha | 1.0 | float | The value at which leakage will begin to saturate. Ex. alpha = 1.0 means that the output will never be less than -1.0 when inactivated. |

## Size and Performance
ELU is a simple function and is well-suited for deployment on resource-constrained devices or when working with large neural networks.

## Plots
<img src="../../images/activation-functions/elu.png" alt="ELU Function" width="500" height="auto">

<img src="../../images/activation-functions/elu-derivative.png" alt="ELU Derivative" width="500" height="auto">

## Example
```php
use Rubix\ML\NeuralNet\ActivationFunctions\ELU;
use Rubix\ML\NeuralNet\ActivationFunctions\ELU\ELU;

$activationFunction = new ELU(2.5);
```

## References
[^1]: D. A. Clevert et al. (2016). Fast and Accurate Deep Network Learning by Exponential Linear Units.
[1]: D. A. Clevert et al. (2016). Fast and Accurate Deep Network Learning by Exponential Linear Units.
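The piecewise definition above can be sketched as a plain scalar function. This is a hypothetical standalone helper for illustration only; the library implementation operates elementwise over NDArray buffers.

```php
// Hypothetical scalar sketch of the ELU formula above (not the library API).
function elu(float $x, float $alpha = 1.0): float
{
    // x > 0: identity; x <= 0: saturates smoothly toward -alpha
    return $x > 0.0 ? $x : $alpha * (exp($x) - 1.0);
}
```

With the default `alpha`, `elu(-10.0)` is close to `-1.0`, matching the saturation behavior described in the parameters table.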
16 changes: 14 additions & 2 deletions docs/neural-network/activation-functions/gelu.md
@@ -1,14 +1,26 @@
<span style="float:right;"><a href="https://github.com/RubixML/ML/blob/master/src/NeuralNet/ActivationFunctions/GELU.php">[source]</a></span>
<span style="float:right;"><a href="https://github.com/RubixML/ML/blob/master/src/NeuralNet/ActivationFunctions/GELU/GELU.php">[source]</a></span>

# GELU
Gaussian Error Linear Units (GELUs) are rectifiers that are gated by the magnitude of their input rather than the sign of their input as with ReLU variants. Their output can be interpreted as the expected value of a neuron with random dropout regularization applied.

$$
\text{GELU}(x) = 0.5 \cdot x \left(1 + \tanh\left(\sqrt{\frac{2}{\pi}} \left(x + 0.044715 \cdot x^3\right)\right)\right)
$$

## Parameters
This activation function does not have any parameters.

## Size and Performance
GELU is computationally more expensive than simpler activation functions like ReLU due to its use of hyperbolic tangent and exponential calculations. The implementation uses an approximation formula to improve performance, but it still requires more computational resources. Despite this cost, GELU has gained popularity in transformer architectures and other deep learning models due to its favorable properties for training deep networks.

## Plots
<img src="../../images/activation-functions/gelu.png" alt="GELU Function" width="500" height="auto">

<img src="../../images/activation-functions/gelu-derivative.png" alt="GELU Derivative" width="500" height="auto">

## Example
```php
use Rubix\ML\NeuralNet\ActivationFunctions\GELU;
use Rubix\ML\NeuralNet\ActivationFunctions\GELU\GELU;

$activationFunction = new GELU();
```
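The tanh approximation above can be sketched directly as a scalar function. This is a hypothetical helper for illustration; the library implementation applies the same formula over NDArray buffers.

```php
// Hypothetical scalar sketch of the GELU tanh approximation above.
function gelu(float $x): float
{
    // sqrt(2 / pi) scales the cubic polynomial inside tanh
    return 0.5 * $x * (1.0 + tanh(sqrt(2.0 / M_PI) * ($x + 0.044715 * $x ** 3)));
}
```

For large positive inputs the tanh gate saturates to 1 and `gelu($x)` approaches `$x`; for large negative inputs the gate saturates to -1 and the output approaches 0.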
Expand Down
29 changes: 29 additions & 0 deletions docs/neural-network/activation-functions/hard-sigmoid.md
@@ -0,0 +1,29 @@
<span style="float:right;"><a href="https://github.com/RubixML/ML/blob/master/src/NeuralNet/ActivationFunctions/HardSigmoid/HardSigmoid.php">[source]</a></span>

# Hard Sigmoid
A piecewise linear approximation of the sigmoid function that is computationally more efficient. The Hard Sigmoid function has an output value between 0 and 1, making it useful for binary classification problems.

$$
\text{HardSigmoid}(x) = \max\left(0,\min\left(1, 0.2x + 0.5\right)\right)
$$

## Parameters
This activation function does not have any parameters.

## Size and Performance
Hard Sigmoid has a minimal memory footprint compared to the standard Sigmoid function, as it uses simple arithmetic operations (multiplication, addition) and comparisons instead of expensive exponential calculations. This makes it particularly well-suited for mobile and embedded applications or when computational resources are limited.

## Plots
<img src="../../images/activation-functions/hard-sigmoid.png" alt="Hard Sigmoid Function" width="500" height="auto">

<img src="../../images/activation-functions/hard-sigmoid-derivative.png" alt="Hard Sigmoid Derivative" width="500" height="auto">

## Example
```php
use Rubix\ML\NeuralNet\ActivationFunctions\HardSigmoid\HardSigmoid;

$activationFunction = new HardSigmoid();
```

## References
[1]: https://en.wikipedia.org/wiki/Hard_sigmoid
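The clipped-line formula above reduces to a multiply-add and two comparisons. The sketch below is a hypothetical scalar helper for illustration; the library implementation works on NDArray buffers.

```php
// Hypothetical scalar sketch of the Hard Sigmoid formula above.
function hardSigmoid(float $x): float
{
    // Linear segment 0.2x + 0.5, clipped to the [0, 1] range
    return max(0.0, min(1.0, 0.2 * $x + 0.5));
}
```

Inputs at or below -2.5 map to 0, inputs at or above 2.5 map to 1, and `hardSigmoid(0.0)` returns 0.5, mirroring the true sigmoid at the origin.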