Why not use the built-in `torch.nn.functional.conv2d` for SKA? #8

@DLeeeeeee

Description

Many thanks for your excellent work and for sharing it with the community.

I noticed that the SKA module includes custom forward and backward functions. Please correct me if I'm mistaken, but SKA seems to be essentially a convolution with dynamic kernel weights. I'm curious: what is the reason for implementing it manually instead of using PyTorch's built-in conv2d? Using the built-in function could potentially simplify deployment.
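
For reference, here is roughly what I have in mind as a pure-PyTorch equivalent, using `F.unfold` to gather neighborhoods and a per-pixel weighted sum. The shapes, and the assumption that each channel gets its own K×K kernel at every location, are just my reading of the module, not necessarily the exact SKA semantics:

```python
import torch
import torch.nn.functional as F

def dynamic_conv2d(x, weights, kernel_size=3):
    """Sketch of a convolution whose kernel varies per spatial location.

    x:       (B, C, H, W) input feature map
    weights: (B, C, K*K, H, W) dynamic kernels, one K*K window per channel
             and per pixel (this layout is an assumption for illustration)
    """
    B, C, H, W = x.shape
    K = kernel_size
    # Gather the K x K neighborhood around every pixel: (B, C*K*K, H*W)
    patches = F.unfold(x, kernel_size=K, padding=K // 2)
    patches = patches.view(B, C, K * K, H, W)
    # Weighted sum over the window at each location
    return (patches * weights).sum(dim=2)

x = torch.randn(2, 16, 32, 32)
w = torch.randn(2, 16, 9, 32, 32)
y = dynamic_conv2d(x, w)  # -> (2, 16, 32, 32)
```

I realize an unfold-based version like this materializes every neighborhood, so memory overhead may well be part of the answer; I'm mainly asking whether that is the motivation for the custom kernels, or whether there is something else built-in ops cannot express here.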
