Kullback-Leibler divergence between two distributions in PyTorch
I need to calculate the Kullback-Leibler divergence between two distributions in PyTorch. The two distributions are diagonal Gaussians parameterized by means mu_r and mu_q and log-variances log_var_r and log_var_q. All of my inputs are torch.Size([500, 7]). I am using the following code to calculate it, but I am not sure whether this formulation is correct. Could you please help me with that? Is there any other method to calculate the KL divergence in torch? Thanks
```python
import torch

def KL(mu_r, log_var_r, mu_q, log_var_q):
    # Convert log-variances to standard deviations: sigma = exp(0.5 * log_var)
    sigma_q = torch.exp(0.5 * log_var_q)
    sigma_r = torch.exp(0.5 * log_var_r)
    t1 = torch.square(sigma_q / sigma_r)                    # sigma_q^2 / sigma_r^2
    t2 = torch.log(torch.square(sigma_r / sigma_q))         # log(sigma_r^2 / sigma_q^2), i.e. log_var_r - log_var_q
    t3 = torch.square(mu_r - mu_q) / torch.square(sigma_r)  # squared mean difference scaled by sigma_r^2
    # Sum over the feature dimension and subtract 0.5 * k (here k = 7)
    kl_loss = 0.5 * torch.sum(t1 + t2 + t3, dim=1) - 0.5 * t1.size(1)
    return kl_loss
```
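For reference, the closed-form KL divergence between two k-dimensional diagonal Gaussians $q = \mathcal{N}(\mu_q, \operatorname{diag}(\sigma_q^2))$ and $r = \mathcal{N}(\mu_r, \operatorname{diag}(\sigma_r^2))$ is

$$
D_{\mathrm{KL}}(q \,\|\, r) = \frac{1}{2} \sum_{i=1}^{k} \left( \frac{\sigma_{q,i}^2}{\sigma_{r,i}^2} + \log \frac{\sigma_{r,i}^2}{\sigma_{q,i}^2} + \frac{(\mu_{r,i} - \mu_{q,i})^2}{\sigma_{r,i}^2} - 1 \right),
$$

which the code above implements term by term (t1, t2, t3, and the trailing $-0.5\,k$). So the formulation is correct, with the convention that it computes KL(q || r); note the divergence is asymmetric, so swapping the arguments gives a different value.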
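As for other methods in torch: PyTorch ships `torch.distributions.kl_divergence`, which has a registered closed form for Gaussians. Below is a minimal sketch, assuming the same log-variance parameterization as above (the wrapper name `KL_builtin` is just for illustration):

```python
import torch
from torch.distributions import Independent, Normal, kl_divergence

def KL_builtin(mu_r, log_var_r, mu_q, log_var_q):
    # Normal takes a standard deviation, so convert from log-variance.
    r = Independent(Normal(mu_r, torch.exp(0.5 * log_var_r)), 1)
    q = Independent(Normal(mu_q, torch.exp(0.5 * log_var_q)), 1)
    # Argument order matters: kl_divergence(q, r) computes KL(q || r).
    return kl_divergence(q, r)  # shape [500]

# Quick check against the manual KL() above; should agree to float tolerance.
mu_r, mu_q = torch.randn(500, 7), torch.randn(500, 7)
log_var_r, log_var_q = torch.randn(500, 7), torch.randn(500, 7)
print(torch.allclose(KL(mu_r, log_var_r, mu_q, log_var_q),
                     KL_builtin(mu_r, log_var_r, mu_q, log_var_q), atol=1e-5))
```

Wrapping each `Normal` in `Independent(..., 1)` marks the last axis as the event dimension, so `kl_divergence` sums over the 7 features and returns one value per batch row, matching the `dim=1` sum in the manual version.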
Sources
Source: Stack Overflow, licensed under CC BY-SA 3.0.