
Convert activation functions to numpower #381


Open · wants to merge 115 commits into base: 3.0

Conversation


@SkibidiProduction commented on Jul 14, 2025

Activation implementations

  • Swapped out custom Tensor code for NumPower APIs across all functions: ReLU, LeakyReLU, ELU, GELU, HardSigmoid, SiLU, Tanh, Sigmoid, Softmax, Softplus, Softsign, ThresholdedReLU, etc.

  • Updated derivative methods to use NumPower's derivative helpers (see the sketch after this list).
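
For orientation, here is a minimal sketch of the conversion pattern, assuming NumPower's `NDArray` class exposes the static `maximum()` and `greater()` helpers and scalar broadcasting described in its documentation. The class below is a simplified stand-in, not a copy of the actual Rubix ML implementation:

```php
<?php

/**
 * Simplified sketch of a NumPower-backed ReLU (illustrative only).
 */
class ReLUSketch
{
    /**
     * Apply ReLU element-wise: max(0, x).
     */
    public function activate(\NDArray $input) : \NDArray
    {
        return \NDArray::maximum($input, 0.0);
    }

    /**
     * ReLU derivative: 1.0 where x > 0, 0.0 elsewhere.
     */
    public function differentiate(\NDArray $input) : \NDArray
    {
        return \NDArray::greater($input, 0.0);
    }
}
```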

Tests

  • Refactored unit tests to assert against NumPower outputs.

  • Adjusted tolerances and assertions to match NumPower's numeric behavior (see the test sketch below).
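
As an illustration of the tolerance-based assertions, a hypothetical PHPUnit test might look like the following. `assertEqualsWithDelta()` is standard PHPUnit; `NDArray::array()` and `toArray()` are assumed from NumPower's documented API, and `ReLUSketch` refers to the sketch above:

```php
<?php

use PHPUnit\Framework\TestCase;

/**
 * Hypothetical test sketch; the real tests in this PR compare against
 * NumPower outputs with tolerances matched to its numeric behavior.
 */
class ReLUSketchTest extends TestCase
{
    public function testActivate() : void
    {
        $input = \NDArray::array([[-2.0, 0.0, 3.5]]);

        $output = (new ReLUSketch())->activate($input)->toArray();

        // A small delta absorbs single-precision rounding differences.
        $this->assertEqualsWithDelta([[0.0, 0.0, 3.5]], $output, 1e-6);
    }
}
```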

Documentation

  • Added/updated images under docs/images/activation-functions/ to illustrate each activation curve and its derivative using the new implementations.

  • Cleaned up corresponding markdown to reference the updated diagrams.

Code cleanup

  • Aligned naming conventions and method signatures with NumPower's API.

  • Minor style fixes (whitespace, imports, visibility).

Commits

apphp and others added 30 commits, starting June 25, 2025 19:44, including:

  • Sam 8: LeakyReLU and refactoring
  • Sam 12: Softmax and Softplus functions