@sergeyprokudin
Last active October 28, 2025 12:31
Multivariate Gaussian Negative Log-Likelihood Loss for Keras
import keras.backend as K
import numpy as np


def gaussian_nll(ytrue, ypreds):
    """Keras implementation of the multivariate Gaussian negative log-likelihood loss.

    This implementation assumes a diagonal covariance matrix.

    Parameters
    ----------
    ytrue: tf.tensor of shape [n_samples, n_dims]
        ground truth values
    ypreds: tf.tensor of shape [n_samples, n_dims*2]
        predicted mu and logsigma values (e.g. by your neural network)

    Returns
    -------
    neg_log_likelihood: float
        negative log-likelihood averaged over samples

    This loss can then be used as a target loss for any Keras model, e.g.:

        model.compile(loss=gaussian_nll, optimizer='Adam')
    """
    n_dims = int(int(ypreds.shape[1]) / 2)
    mu = ypreds[:, 0:n_dims]
    logsigma = ypreds[:, n_dims:]

    mse = -0.5 * K.sum(K.square((ytrue - mu) / K.exp(logsigma)), axis=1)
    sigma_trace = -K.sum(logsigma, axis=1)
    log2pi = -0.5 * n_dims * np.log(2 * np.pi)

    log_likelihood = mse + sigma_trace + log2pi

    return K.mean(-log_likelihood)
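The same formula can be sanity-checked in plain NumPy against SciPy's `multivariate_normal.logpdf` with a diagonal covariance built from the predicted log-sigmas (a minimal sketch, assuming SciPy is available; the function name `gaussian_nll_np` and the random test data are illustrative, not part of the gist):

```python
import numpy as np
from scipy.stats import multivariate_normal


def gaussian_nll_np(ytrue, ypreds):
    """NumPy mirror of the Keras loss: diagonal-covariance Gaussian NLL."""
    n_dims = ypreds.shape[1] // 2
    mu = ypreds[:, :n_dims]
    logsigma = ypreds[:, n_dims:]
    # Same three terms as the Keras version: quadratic, log-determinant, constant.
    mse = -0.5 * np.sum(np.square((ytrue - mu) / np.exp(logsigma)), axis=1)
    sigma_trace = -np.sum(logsigma, axis=1)
    log2pi = -0.5 * n_dims * np.log(2 * np.pi)
    return np.mean(-(mse + sigma_trace + log2pi))


rng = np.random.default_rng(0)
y = rng.normal(size=(4, 3))            # ground truth, n_dims = 3
preds = rng.normal(size=(4, 6))        # columns 0-2: mu, columns 3-5: logsigma
nll = gaussian_nll_np(y, preds)

# Cross-check: average -logpdf under N(mu, diag(sigma^2)), one sample at a time.
ref = np.mean([
    -multivariate_normal.logpdf(
        y[i], preds[i, :3], np.diag(np.exp(preds[i, 3:]) ** 2)
    )
    for i in range(4)
])
assert np.isclose(nll, ref)
```

The cross-check works because a diagonal-covariance Gaussian factorizes per dimension, so the closed-form sum of the three terms must match SciPy's density exactly.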
@13512525

Hello, I'd like to ask: is the value predicted here the logarithm of the standard deviation directly? And how do you design the variance-prediction network — does it output the log value itself, or do you take the logarithm of its prediction afterwards in the code?
