Description
I am trying to understand this loss function:
```python
def calc_loss(self, y_true, y_pred):
    """Compute the cosine loss over the batch with matrix operations."""
    # Labels are duplicated per sentence pair, so keep one per pair.
    y_true = y_true[::2]
    # L2-normalize the embeddings.
    norms = (y_pred ** 2).sum(axis=1, keepdims=True) ** 0.5
    y_pred = y_pred / norms
    # Scaled cosine similarity between the two sentences of each pair.
    y_pred = torch.sum(y_pred[::2] * y_pred[1::2], dim=1) * 20
    # Pairwise differences sim_i - sim_j.
    y_pred = y_pred[:, None] - y_pred[None, :]
    # Keep only positions where label_i < label_j; mask out the rest.
    y_true = y_true[:, None] < y_true[None, :]
    y_true = y_true.float()
    y_pred = y_pred - (1 - y_true) * 1e12
    y_pred = y_pred.view(-1)
    # Prepend 0 so the result is log(1 + sum of the exp terms).
    y_pred = torch.cat((torch.tensor([0]).float().to(self.device), y_pred), dim=0)
    return torch.logsumexp(y_pred, dim=0)
```
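To make sure I am reading it right, here is a NumPy sketch of what I understand the function to compute (the function and variable names are my own, not from this repo, and the masking via `np.where` replaces the `- (1 - mask) * 1e12` trick):

```python
import numpy as np

def cosent_loss(y_true, y_pred, scale=20.0):
    """NumPy sketch of the loss above; names are illustrative only."""
    y_true = y_true[::2]                          # one label per pair
    y_pred = y_pred / np.linalg.norm(y_pred, axis=1, keepdims=True)
    sims = (y_pred[::2] * y_pred[1::2]).sum(axis=1) * scale
    diffs = sims[:, None] - sims[None, :]         # sim_i - sim_j
    mask = y_true[:, None] < y_true[None, :]      # only where label_i < label_j
    logits = np.where(mask, diffs, -1e12).ravel()
    logits = np.concatenate(([0.0], logits))      # the "+1" inside log(1 + sum(exp(...)))
    m = logits.max()                              # stable logsumexp
    return m + np.log(np.exp(logits - m).sum())
```

On random inputs this matches my reading: the loss is near zero when every higher-labeled pair already has a higher similarity, and grows when a lower-labeled pair is more similar than a higher-labeled one.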
- Why do we take every other value (`[::2]`) from the true labels?
- Why do we take the dot product between alternating rows of `y_pred` (`y_pred[::2] * y_pred[1::2]`)?

If possible, could you share a link to the paper or documentation for this? Thanks.
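My guess is that the batch is laid out with the two sentences of each pair interleaved, and the label duplicated across both rows. A tiny sketch of that assumed layout (the pair data here is made up):

```python
# Assumed batch layout: rows 0,1 form pair 0, rows 2,3 form pair 1, etc.
pairs = [("a1", "b1", 1.0), ("a2", "b2", 0.0)]

batch = []   # one row per sentence
labels = []  # one label per row, duplicated within a pair
for a, b, label in pairs:
    batch.extend([a, b])
    labels.extend([label, label])

# labels == [1.0, 1.0, 0.0, 0.0]; labels[::2] keeps one copy per pair
assert labels[::2] == [1.0, 0.0]
# batch[::2] are the first sentences, batch[1::2] their partners,
# so y_pred[::2] * y_pred[1::2] pairs each sentence with its partner
assert list(zip(batch[::2], batch[1::2])) == [("a1", "b1"), ("a2", "b2")]
```

If that layout is right, it would explain both slicing patterns, but I would appreciate confirmation.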