Abstract
The graphical model formalism makes it possible to describe multivariate probability distributions using a graph in which random variables are represented by nodes and the absence of an edge corresponds to conditional independence. While this
formalism is very general, the corresponding maximum-likelihood problem is often challenging numerically. In addition,
one often needs to obtain a graph that is sparse, in order to enhance interpretability of the result. In this talk, we examine
the problem in which the log-likelihood function is penalized by an l1-norm term to encourage sparsity, in two special cases: first for Gaussian, then for Boolean variables. In the Gaussian case, we discuss first-order methods and a block-coordinate descent algorithm. For Boolean random variables, the problem is NP-hard due to the exponential number of
terms in the log-likelihood function. We discuss two approximations, one based on Wainwright and Jordan's log-
determinant approximation (2005), and another based on lifting and rank relaxation.
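For concreteness, the Gaussian case described above is commonly posed as the following l1-penalized log-determinant program (a standard formulation sketched here under the assumption that S denotes the empirical covariance matrix and lambda > 0 the penalty weight, neither of which is specified in the abstract):

\[
\max_{X \succ 0} \;\; \log\det X \;-\; \mathrm{trace}(SX) \;-\; \lambda \sum_{i,j} |X_{ij}|,
\]

where the variable X is the (symmetric, positive definite) estimate of the inverse covariance matrix, and the l1 penalty drives entries of X, and hence edges of the graph, to zero.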