Speaker
Description
Conditional independence is a ternary relation from probability theory among subcollections of jointly distributed random variables. Among normally distributed (i.e., "Gaussian") variables, these relations are characterized by the vanishing of specific subdeterminants of the distribution's positive definite covariance matrix, the so-called almost-principal minors.
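To recall the underlying fact in symbols (a standard characterization, stated here only for orientation): for a Gaussian vector $X = (X_1,\dots,X_n)$ with positive definite covariance matrix $\Sigma$, and $i, j \notin K$,
\[
  X_i \mathrel{\perp\!\!\!\perp} X_j \mid X_K
  \iff
  \det \Sigma_{\{i\}\cup K,\;\{j\}\cup K} = 0,
\]
where $\Sigma_{A,B}$ denotes the submatrix of $\Sigma$ with rows indexed by $A$ and columns indexed by $B$. These subdeterminants are the almost-principal minors.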
The combinatorial structures realizable as conditional independence relations of positive definite matrices can be studied algebraically, and this theory is in many ways similar to that of (oriented) matroids in synthetic geometry, with points in a vector space replaced by random variables and linear independence by stochastic independence.
This talk gives an overview of the Gaussian conditional independence inference problem with an emphasis on the parallels to matroid theory, including the method of final polynomials and universality theorems. Example computations underline the significance of the positive definiteness restriction for the combinatorial structure.
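To illustrate what such an example computation can look like (a minimal sketch, not taken from the talk; it assumes Python with numpy and uses a hypothetical 3x3 covariance matrix), one can enumerate the conditional independence statements holding for a given positive definite matrix by testing its almost-principal minors:

import itertools
import numpy as np

def ci_holds(Sigma, i, j, K, tol=1e-12):
    # The Gaussian CI statement X_i _||_ X_j | X_K holds iff the
    # almost-principal minor det Sigma[{i} u K, {j} u K] vanishes.
    rows = [i] + list(K)
    cols = [j] + list(K)
    return abs(np.linalg.det(Sigma[np.ix_(rows, cols)])) < tol

# Hypothetical positive definite covariance matrix on three variables.
Sigma = np.array([[1.00, 0.50, 0.50],
                  [0.50, 1.00, 0.25],
                  [0.50, 0.25, 1.00]])

n = Sigma.shape[0]
for i, j in itertools.combinations(range(n), 2):
    rest = [k for k in range(n) if k not in (i, j)]
    for r in range(len(rest) + 1):
        for K in itertools.combinations(rest, r):
            if ci_holds(Sigma, i, j, K):
                print(f"X{i} _||_ X{j} given {list(K)}")

For this particular matrix (with zero-based indexing) the only statement found is X1 _||_ X2 given X0, since the corresponding minor 0.25 * 1 - 0.5 * 0.5 vanishes, while no marginal independences hold.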