
RuntimeError when using Recurrent blocks #285

@naveedunjum

Description


Describe the bug
When I try to use the CUBA Recurrent blocks (slayer.block.cuba.Recurrent) in my network, I get:

RuntimeError: Output 0 of SelectBackward0 is a view and is being modified inplace. This view was created inside a custom Function (or because an input was returned as-is) and the autograd logic to handle view+inplace would override the custom backward associated with the custom Function, leading to incorrect gradients. This behavior is forbidden. You can fix this by cloning the output of the custom Function.

I tried this with various networks, and I also replaced the Dense layer with the Recurrent layer in the XOR regression and Oxford tutorials; the error is the same every time. Here is the network I am using, which is the same as the XOR network but with a Recurrent block:

import torch
import lava.lib.dl.slayer as slayer


class Network(torch.nn.Module):
    def __init__(self):
        super(Network, self).__init__()

        # CUBA neuron parameters, shared by all blocks
        neuron_params = {
            'threshold'     : 0.1,
            'current_decay' : 1,
            'voltage_decay' : 0.1,
            'requires_grad' : True,
        }

        self.blocks = torch.nn.ModuleList([
            slayer.block.cuba.Dense(neuron_params, 100, 256),
            slayer.block.cuba.Recurrent(neuron_params, 256, 256),  # replaces the middle Dense block
            slayer.block.cuba.Dense(neuron_params, 256, 1),
        ])

    def forward(self, spike):
        # Pass the spike tensor through each block in sequence
        for block in self.blocks:
            spike = block(spike)
        return spike
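
For reference, the same autograd restriction can be reproduced without lava-dl. Below is a minimal sketch assuming plain PyTorch only; Identity is a hypothetical stand-in for the custom autograd Function used by the neuron dynamics, not actual lava-dl code. Indexing the output of such a Function creates a view (SelectBackward0), and modifying that view in place raises exactly this error:

import torch

class Identity(torch.autograd.Function):
    """Returns its input as-is, so autograd treats the output
    as a view created inside a custom Function."""
    @staticmethod
    def forward(ctx, x):
        return x

    @staticmethod
    def backward(ctx, grad_output):
        return grad_output

h = torch.zeros(2, 3, requires_grad=True) + 0.0  # non-leaf tensor in the graph
y = Identity.apply(h)
y[0] += 1.0  # RuntimeError: Output 0 of SelectBackward0 is a view ...

This suggests the Recurrent block performs an in-place write into (a view of) the output of a custom Function somewhere in its time loop.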

To reproduce current behavior
Steps to reproduce the behavior:

  1. Replace the Dense layer with the Recurrent layer in the XOR regression or Oxford tutorial.
  2. I get this error:
    RuntimeError: Output 0 of SelectBackward0 is a view and is being modified inplace. This view was created inside a custom Function (or because an input was returned as-is) and the autograd logic to handle view+inplace would override the custom backward associated with the custom Function, leading to incorrect gradients. This behavior is forbidden. You can fix this by cloning the output of the custom Function.
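
The error message itself suggests cloning the output of the custom Function. In the standalone sketch above, a clone() does make the in-place write legal (again only a sketch, not a lava-dl patch; the actual fix would presumably have to go inside the Recurrent block implementation):

y = Identity.apply(h).clone()  # clone() allocates fresh memory, so y is no longer a view
y[0] += 1.0                    # no error; gradients still flow through CloneBackward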

Expected behavior
It should run without any problems, since the same network works well when only Dense layers are used.

Environment (please complete the following information):

  • Device: MacBook Air (M2)
  • OS: macOS
  • Lava version: [e.g. 0.6.1]


Labels: 1-bug (Something isn't working)
