Sample from multivariate normal/Gaussian distribution in C++

I've been hunting for a convenient way to sample from a multivariate normal distribution. Does anyone know of a readily available code snippet to do that? For matrices/vectors, I'd prefer to use Boost or Eigen or another phenomenal library I'm not familiar with, but I could use GSL in a pinch. I'd also like the method to accept nonnegative-definite covariance matrices rather than requiring strictly positive-definite ones (as, e.g., the Cholesky decomposition does). This exists in MATLAB, NumPy, and others, but I've had a hard time finding a ready-made C/C++ solution.

If I have to implement it myself, I'll grumble but that's fine. If I do that, Wikipedia makes it sound like I should

  1. generate n zero-mean, unit-variance, independent normal samples (Boost will do this)
  2. find the eigendecomposition of the covariance matrix
  3. scale each of the n samples by the square root of the corresponding eigenvalue
  4. rotate the vector of samples by pre-multiplying the scaled vector by the matrix of orthonormal eigenvectors found by the decomposition

I would like this to work quickly. Does someone have an intuition for when it would be worthwhile to check whether the covariance matrix is positive-definite, and if so, use Cholesky instead?
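For intuition on the Cholesky route: when the covariance is positive-definite, sampling reduces to x = mean + L·z, where L is the Cholesky factor and z is a vector of independent standard normals. Here is a minimal standard-library-only sketch for the 2x2 case (the covariance values are illustrative, and cholesky2x2 is a helper written for this example, not a library function):

#include <array>
#include <cmath>
#include <iostream>
#include <random>

// Cholesky factor L of a 2x2 positive-definite covariance [[a, b], [b, c]]:
// L = [[sqrt(a), 0], [b/sqrt(a), sqrt(c - b*b/a)]], returned as {L00, L10, L11}
std::array<double, 3> cholesky2x2(double a, double b, double c)
{
    double l00 = std::sqrt(a);
    double l10 = b / l00;
    double l11 = std::sqrt(c - l10 * l10);
    return {l00, l10, l11};
}

int main()
{
    // zero mean, covariance [[1, .5], [.5, 1]]
    auto L = cholesky2x2(1.0, 0.5, 1.0);

    std::mt19937 gen{std::random_device{}()};
    std::normal_distribution<> dist;

    // x = L * z, where z holds independent standard normals
    double z0 = dist(gen), z1 = dist(gen);
    double x0 = L[0] * z0;
    double x1 = L[1] * z0 + L[2] * z1;
    std::cout << x0 << "\n" << x1 << "\n";
}

Since L * L^T equals the covariance, the resulting x has exactly the requested second moments; the same idea generalizes to n dimensions with a full Cholesky routine.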



Solution 1:[1]

Here is a class to generate multivariate normal random variables in Eigen which uses C++11 random number generation and avoids the Eigen::internal stuff by using Eigen::MatrixBase::unaryExpr():

#include <Eigen/Dense>
#include <random>

struct normal_random_variable
{
    normal_random_variable(Eigen::MatrixXd const& covar)
        : normal_random_variable(Eigen::VectorXd::Zero(covar.rows()), covar)
    {}

    normal_random_variable(Eigen::VectorXd const& mean, Eigen::MatrixXd const& covar)
        : mean(mean)
    {
        // The eigendecomposition also handles positive semi-definite
        // covariances, where a Cholesky factorization would fail.
        Eigen::SelfAdjointEigenSolver<Eigen::MatrixXd> eigenSolver(covar);
        transform = eigenSolver.eigenvectors() * eigenSolver.eigenvalues().cwiseSqrt().asDiagonal();
    }

    Eigen::VectorXd mean;
    Eigen::MatrixXd transform;

    Eigen::VectorXd operator()() const
    {
        static std::mt19937 gen{ std::random_device{}() };
        static std::normal_distribution<> dist;

        // Fill a vector with independent standard normals, then apply the
        // affine map x = mean + A z, where A A^T equals the covariance.
        return mean + transform * Eigen::VectorXd{ mean.size() }.unaryExpr([&](auto x) { return dist(gen); });
    }
};

It can be used as

int size = 2;
Eigen::MatrixXd covar(size,size);
covar << 1, .5,
        .5, 1;

normal_random_variable sample { covar };

std::cout << sample() << std::endl;
std::cout << sample() << std::endl;

Solution 2:[2]

For a ready-made solution, the Armadillo C++ library supports sampling from a multivariate Gaussian distribution (even with positive semi-definite covariance matrices) via its mvnrnd() function.
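A sketch of what that looks like, assuming a recent Armadillo version where mvnrnd() is available (the mean and covariance values here are illustrative):

#include <armadillo>

int main()
{
    arma::vec mu = {1.0, 2.0};
    arma::mat sigma = {{1.0, 0.5},
                       {0.5, 1.0}};

    // Each of the 5 columns of X is one draw from N(mu, sigma).
    arma::mat X = arma::mvnrnd(mu, sigma, 5);
    X.print("samples:");
}

mvnrnd() decomposes sigma internally, so the caller does not need to factor the covariance matrix first.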

Solution 3:[3]

What about doing an SVD and then checking whether the matrix is positive-definite? Note that this does not require computing the Cholesky factorization. That said, SVD is generally slower than Cholesky, although both are cubic in the number of flops.
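A cheaper alternative to the SVD check: simply attempt the Cholesky factorization and see whether it succeeds, since it fails exactly when the matrix is not positive-definite, and at roughly n^3/3 flops it is cheaper than an SVD. A standard-library-only sketch (is_positive_definite is a helper written for this example):

#include <cmath>
#include <vector>

// Run a Cholesky factorization in place and bail out as soon as a
// non-positive pivot appears; success is equivalent to the (symmetric)
// input matrix being positive-definite.
bool is_positive_definite(std::vector<std::vector<double>> a)
{
    const std::size_t n = a.size();
    for (std::size_t j = 0; j < n; ++j) {
        double d = a[j][j];
        for (std::size_t k = 0; k < j; ++k) d -= a[j][k] * a[j][k];
        if (d <= 0.0) return false;  // pivot fails: not positive-definite
        a[j][j] = std::sqrt(d);
        for (std::size_t i = j + 1; i < n; ++i) {
            double s = a[i][j];
            for (std::size_t k = 0; k < j; ++k) s -= a[i][k] * a[j][k];
            a[i][j] = s / a[j][j];
        }
    }
    return true;
}

For example, {{2, 1}, {1, 2}} passes, while the singular (merely semi-definite) {{1, 1}, {1, 1}} hits a zero pivot and fails, so a sampler could fall back to the eigendecomposition in that case.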

Sources

This article follows the attribution requirements of Stack Overflow and is licensed under CC BY-SA 3.0.

Source: Stack Overflow

Solution Source
Solution 1
Solution 2 user17415130
Solution 3 Anil CR