We consider the problem of learning a directed acyclic graph (DAG) model based on conditional independence testing. The most widely used approaches to this problem are variants of the PC algorithm. One of the drawbacks of the PC algorithm is that it requires the strong-faithfulness assumption, which has been shown to be restrictive, especially for graphs with cycles in the skeleton. In this paper, we propose an alternative method based on finding the permutation of the variables that yields the sparsest DAG. We prove that the constraints required for our sparsest permutation (SP) algorithm are strictly weaker than faithfulness and are necessary for any causal inference algorithm based on conditional independence testing. Through specific examples we show that the SP algorithm has better performance than the PC algorithm. In the Gaussian setting, we prove that our algorithm reduces to finding the permutation of the variables under which the Cholesky decomposition of the inverse covariance matrix is sparsest. Using this connection, we show that in the oracle setting, where the true covariance matrix is known, the SP algorithm is in fact equivalent to $\ell_0$-penalized maximum likelihood estimation.
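To illustrate the Gaussian oracle setting described above, here is a minimal sketch of the sparsest-permutation idea: enumerate permutations of the variables, compute the Cholesky factor of the permuted inverse covariance matrix, and keep the permutation whose factor has the fewest nonzeros. The function name, tolerance, and brute-force enumeration (feasible only for a handful of variables) are illustrative choices, not the authors' implementation.

```python
import itertools
import numpy as np

def sp_gaussian_oracle(Sigma, tol=1e-9):
    """Brute-force sketch of the SP idea in the Gaussian oracle setting:
    return the variable permutation whose Cholesky factor of the permuted
    inverse covariance matrix has the fewest nonzero entries."""
    K = np.linalg.inv(Sigma)          # inverse covariance (precision) matrix
    p = K.shape[0]
    best_perm, best_nnz = None, np.inf
    for perm in itertools.permutations(range(p)):
        Kp = K[np.ix_(perm, perm)]    # permute rows and columns of K
        L = np.linalg.cholesky(Kp)    # lower-triangular factor, Kp = L L^T
        nnz = int(np.sum(np.abs(L) > tol))  # sparsity of the factor
        if nnz < best_nnz:
            best_perm, best_nnz = perm, nnz
    return best_perm, best_nnz
```

For example, for a covariance generated by a three-variable chain DAG, the sparsest factor has five nonzeros (three diagonal entries plus one off-diagonal entry per edge of the two-edge DAG), while unfavorable orderings incur fill-in corresponding to an extra edge.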
This talk is based on joint work with Caroline Uhler (IST Austria).
September 11 @ 12:30
12:30 pm (1h)
Discovery Building, Orchard View Room