
Question about the dimension of the DFT transform #3

@sunxiaoyao-git

Description


Hi, author!
My question concerns the dimension used in the DFT: `torch.fft.rfft(queries, dim=-1)` applies the DFT along the last (feature) dimension. What is the difference between that and applying the DFT along the sequence-length dimension, as FEDformer does?
```python
import torch
import torch.nn as nn

class VarCorAttention(nn.Module):
    def __init__(self, args, mask_flag=True, factor=5, scale=None,
                 attention_dropout=0.1, output_attention=False) -> None:
        super(VarCorAttention, self).__init__()

        self.scale = scale
        self.mask_flag = mask_flag
        self.output_attention = output_attention
        self.dropout = nn.Dropout(attention_dropout)

    def origin_compute_cross_cor(self, queries, keys):
        q_fft = torch.fft.rfft(queries, dim=-1)
        k_fft = torch.fft.rfft(keys, dim=-1)

        res = q_fft * k_fft                   # 1 x 10 x 257
        corr = torch.fft.irfft(res, dim=-1)   # 1 x 10 x 512
        corr = corr.mean(dim=-1)              # 1 x 10
        return corr
```
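To make the question concrete, here is a minimal sketch (not the repository's code; the `1 x 10 x 512` shape is assumed from the comments in the snippet above) contrasting an `rfft` along the last axis with an `rfft` along a different axis. The point is that the choice of `dim` decides *what* is being treated as the signal: which axis actually holds the sequence length depends on how the tensor was permuted before the call.

```python
import torch

torch.manual_seed(0)

# Hypothetical tensor matching the snippet's shape comments:
# batch=1, 10 series, 512 entries on the last axis.
x = torch.randn(1, 10, 512)

# rfft along the last axis: each of the 10 rows is transformed over its
# 512 entries, giving 512 // 2 + 1 = 257 frequency bins.
spec_last = torch.fft.rfft(x, dim=-1)
print(spec_last.shape)  # torch.Size([1, 10, 257])

# rfft along dim=1 instead mixes the 10 series at each position, giving
# only 10 // 2 + 1 = 6 "frequency" bins along that axis.
spec_mid = torch.fft.rfft(x, dim=1)
print(spec_mid.shape)  # torch.Size([1, 6, 512])
```

The two spectra measure entirely different things: periodicity along the transformed axis in the first case versus patterns across the other axis in the second. So whether `dim=-1` matches FEDformer's "DFT over sequence length" depends purely on whether the sequence-length axis is last at the point of the call.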
