
Feature: TGCN should support non linear_activations #596



Merged: 10 commits into JuliaGraphs:master on Apr 28, 2025

Conversation

@juanmigutierrez (Contributor) commented Mar 28, 2025

Hi again @CarloLucibello

Changes in this PR – Solving Issue #591

  • Implemented support for a custom gate activation (gate_activation) in the temporal graph layer cells for GraphNeuralNetworks, with corresponding tests (a brief usage sketch follows this list).
  • Implemented the same custom gate activation (gate_activation) in the temporal graph layer cells for GNNLux, with corresponding tests.
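
For context, here is a minimal usage sketch of what the new keyword is meant to enable. The exact constructor signature may differ from what ends up merged; `TGCNCell`, the `2 => 4` sizes, and the keyword placement here are illustrative only.

```julia
using GraphNeuralNetworks, Flux

# Default behaviour is unchanged: sigmoid gates, as before.
cell_default = TGCNCell(2 => 4)

# With this PR, the gate activation can be swapped for another Flux/NNlib
# function via the new keyword (name taken from this PR's description).
cell_custom = TGCNCell(2 => 4; gate_activation = Flux.relu)
```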

Notes & Open Questions

  • Currently, only the sigmoid gate activations have been made configurable, following the referenced paper (see the GRU-style sketch after this list for where each activation sits).

  • Should we also modify the tanh activation?

    • In Flux, this change is straightforward.
    • However, for Lux, we might need an alternative to Lux.GRUCell, since it uses tanh by default (as far as I understand) and may not be easy to modify.
    • Let me know if the tests cover the necessary cases effectively.
  • I'll have the proposal ready between Tuesday and Wednesday. If you could review it, as recommended in the Julia GSoC application guidelines, I'd really appreciate your help.
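
To make the sigmoid-vs-tanh question concrete, here is a minimal, package-independent sketch of a GRU-style update, marking which activations are the gates (sigmoid in the referenced paper, the ones this PR makes configurable) and which is the candidate-state tanh under discussion. The helper name, weight layout, and update convention are illustrative, not the actual cell internals.

```julia
using NNlib: sigmoid

# z and r are the gates (sigmoid in the paper); h_cand is the candidate
# state (tanh). This PR's gate_activation would replace only the former.
function gru_step(Wz, Wr, Wh, x, h;
                  gate_activation = sigmoid, state_activation = tanh)
    z      = gate_activation.(Wz * vcat(x, h))          # update gate
    r      = gate_activation.(Wr * vcat(x, h))          # reset gate
    h_cand = state_activation.(Wh * vcat(x, r .* h))    # candidate state
    return (1 .- z) .* h .+ z .* h_cand
end

# Tiny smoke test with random weights.
in_dim, out_dim = 3, 4
Wz, Wr, Wh = (randn(out_dim, in_dim + out_dim) for _ in 1:3)
h_new = gru_step(Wz, Wr, Wh, randn(in_dim), zeros(out_dim))
```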

Let me know your thoughts! 🚀 It’s working on my end—please confirm if it runs for you.

@juanmigutierrez (Contributor, Author) commented Mar 29, 2025

I believe the test failures are not related to my changes but to the tests of GNNGraph itself, as far as I can tell.

@CarloLucibello (Member) commented

> I'll have the proposal ready between Tuesday and Wednesday. If you could review it, as recommended in the Julia GSoC application guidelines, I'd really appreciate your help.

Send it on Slack or by email.

@juanmigutierrez (Contributor, Author) commented Mar 30, 2025

I believe I've already made the necessary changes. Let me know if everything looks good.

A couple of points I noticed:

  1. The default activation function differs between the Lux and GraphNeuralNetworks (Flux) GNNConv layers: Lux uses sigmoid, while Flux uses ReLU.
  2. It seems that GNNLux is implemented with a single GCN, whereas the Flux version uses a 2-GCN setup, as described in the original paper (see the sketch after this list). Let me know if I misinterpreted the code. For reference and my own understanding, this is the function we are modifying:
    [screenshot of the modified cell function]
  3. The doctest fails because the generated GNNGraph contains isolated nodes, in a method I did not touch (as far as I know), GConvGRUCell. I can fix this small issue in the following days if the previous steps are OK.
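
Regarding point 2, a hedged sketch of the difference, using GCNConv and GNNChain from GraphNeuralNetworks.jl purely for illustration (the GNNLux internals may use different layer names, sizes, and activations):

```julia
using GraphNeuralNetworks, Flux

# What the GNNLux cell seems to do: a single graph convolution.
single_gcn = GCNConv(2 => 8)

# What the Flux cell (and the original T-GCN paper) uses: two stacked
# graph convolutions applied before the recurrent gating.
two_gcn = GNNChain(GCNConv(2 => 8, relu),
                   GCNConv(8 => 8))
```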

@CarloLucibello (Member) commented

The fact that the Lux and Flux implementations are not in sync is issue #562; the Lux ones need to be updated (not in this PR, of course).

@juanmigutierrez (Contributor, Author) commented

> The fact that the Lux and Flux implementations are not in sync is issue #562; the Lux ones need to be updated (not in this PR, of course).

Perfect, I will tackle that after this issue is solved.

@juanmigutierrez (Contributor, Author) commented

I have implemented your recommendations, and the tests pass correctly.

@juanmigutierrez (Contributor, Author) commented

Let me know if it's okay, and I will start working on other issues.

@CarloLucibello CarloLucibello merged commit b85f098 into JuliaGraphs:master Apr 28, 2025
6 of 7 checks passed