Joe Suzuki, Osaka University, Japan

Conditional Mutual Information Estimation and its application to Conditional Independence Detection

In this paper, we consider estimation of the conditional mutual information (CMI) of continuous variables that is consistent and correctly detects conditional independence as the sample size $n$ grows. We divide the three-dimensional space into smaller regions, estimate the CMI for each quantization, and choose the maximum value as the final estimate. The estimate does not exceed the true CMI value with probability one, takes the value zero with probability one when the two variable sets are conditionally independent given the third, and works even when some of the variables are discrete. The estimator can be applied to Bayesian network structure learning with discrete and continuous variables.
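The quantize-then-maximize idea described above can be illustrated with a minimal sketch: quantize each variable into equal-width bins, compute the plug-in CMI from the empirical cell probabilities, and maximize over several resolutions. Note that this is a simplified illustration under our own assumptions (equal-width binning, the resolutions `(2, 4, 8)`, and the raw plug-in formula); the paper's actual estimator is constructed so that it does not exceed the true CMI and vanishes under conditional independence, properties the bare plug-in maximum below does not have.

```python
import numpy as np

def cmi_quantized(x, y, z, bins):
    """Plug-in estimate of I(X;Y|Z) after quantizing each axis into `bins` cells.

    This is an illustrative sketch, not the paper's estimator: it uses
    equal-width bins and no correction term, so it is biased upward.
    """
    def q(v):
        # Equal-width quantization of a 1-D sample into `bins` cells.
        edges = np.linspace(v.min(), v.max() + 1e-12, bins + 1)
        return np.clip(np.digitize(v, edges) - 1, 0, bins - 1)

    xq, yq, zq = q(np.asarray(x)), q(np.asarray(y)), q(np.asarray(z))
    n = len(xq)

    # Empirical joint distribution over the (x, y, z) cells.
    joint = np.zeros((bins, bins, bins))
    for a, b, c in zip(xq, yq, zq):
        joint[a, b, c] += 1
    p = joint / n

    pz = p.sum(axis=(0, 1))   # P(Z)
    pxz = p.sum(axis=1)       # P(X, Z)
    pyz = p.sum(axis=0)       # P(Y, Z)

    # I(X;Y|Z) = sum p(x,y,z) log [ p(x,y,z) p(z) / (p(x,z) p(y,z)) ]
    cmi = 0.0
    for a in range(bins):
        for b in range(bins):
            for c in range(bins):
                if p[a, b, c] > 0:
                    cmi += p[a, b, c] * np.log(
                        p[a, b, c] * pz[c] / (pxz[a, c] * pyz[b, c])
                    )
    return cmi

def cmi_estimate(x, y, z, resolutions=(2, 4, 8)):
    # Maximize the quantized estimate over a set of resolutions
    # (the resolutions here are an arbitrary illustrative choice).
    return max(cmi_quantized(x, y, z, b) for b in resolutions)
```

For example, on data where $Y$ depends on $X$ directly, the estimate is large, while on data where $X$ and $Y$ are related only through $Z$, the true CMI is zero and the plug-in value stays small (but positive, which is exactly the bias the paper's construction removes).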