A. Platas-López, E. Mezura-Montes, N. Cruz-Ramírez, A. Guerra-Hernández. Discriminative learning of Bayesian network parameters by differential evolution. Applied Mathematical Modelling 93 (2021): 244-256. December 2020. ISSN 0307-904X | ScienceDirect
Abstract: This work proposes Differential Evolution (DE) to train the parameters of Bayesian Networks (BN) by optimizing the conditional log-likelihood (discriminative learning) instead of the log-likelihood (generative learning). A given BN structure encodes conditional independence assumptions among the attributes and induces error when these assumptions do not hold in the data; this error extends to the classification task. The main goal of discriminative learning is to minimize this type of error. Although discriminative parameter learning algorithms for BNs have been proposed, to the best of the authors' knowledge, a meta-heuristic approach has not been devised yet. This is our main contribution: to propose such a solution and evaluate its behavior in order to determine its feasibility in this domain. Regarding the proposed method, the bias provided by the best solution in the population improves DE's performance. DE variants based on JADE, such as L-SHADE and C-JADE, are especially recommended, since their adaptation mechanisms for the mutation and crossover parameters reduce the dependence on parameter calibration. In computational terms, L-SHADE is recommended over the other DE variants. The DE approach works well in essentially every standard situation; it is robust and at least as good as, and often better than, WANBIA, the state-of-the-art method for discriminative learning. Our results suggest that discriminative parameter learning is potentially beneficial when strong independence assumptions among the attributes are made, and that it typically produces more accurate classifiers than generative learning.
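To illustrate the core idea, the sketch below maximizes the conditional log-likelihood, CLL(θ) = Σ_i log P_θ(y_i | x_i), rather than the generative log-likelihood Σ_i log P_θ(y_i, x_i), using a generic differential evolution optimizer. This is only a minimal illustration, not the paper's method: it assumes a naive Bayes structure, uses SciPy's standard `differential_evolution` (not JADE, L-SHADE, or C-JADE), and the toy data, parameterization, and names are all illustrative.

```python
# Minimal sketch: discriminative parameter learning of a naive Bayes
# classifier by maximizing the conditional log-likelihood with a generic
# differential evolution optimizer. Toy data and parameterization are
# illustrative only.
import numpy as np
from scipy.optimize import differential_evolution

rng = np.random.default_rng(0)

# Toy binary data: n samples, d binary attributes, binary class label.
n, d = 200, 4
y = rng.integers(0, 2, size=n)
X = (rng.random((n, d)) < np.where(y[:, None] == 1, 0.7, 0.3)).astype(int)

def unpack(theta):
    """Map an unconstrained vector to naive Bayes probabilities via the
    logistic function: class prior and per-class attribute probabilities."""
    sig = 1.0 / (1.0 + np.exp(-theta))
    prior1 = sig[0]                      # P(y = 1)
    p = sig[1:].reshape(2, d)            # p[c, j] = P(x_j = 1 | y = c)
    return prior1, p

def neg_cll(theta):
    """Negative conditional log-likelihood: -sum_i log P(y_i | x_i)."""
    prior1, p = unpack(theta)
    eps = 1e-12
    # Joint log-probability log P(y = c, x_i) for each class c.
    log_joint = np.empty((n, 2))
    for c, prior in enumerate([1.0 - prior1, prior1]):
        log_px = X @ np.log(p[c] + eps) + (1 - X) @ np.log(1 - p[c] + eps)
        log_joint[:, c] = np.log(prior + eps) + log_px
    # log P(y_i | x_i) = log P(y_i, x_i) - log sum_c P(c, x_i)
    log_norm = np.logaddexp(log_joint[:, 0], log_joint[:, 1])
    return -(log_joint[np.arange(n), y] - log_norm).sum()

# One logit for the class prior plus 2*d logits for attribute probabilities.
bounds = [(-6.0, 6.0)] * (1 + 2 * d)
result = differential_evolution(neg_cll, bounds, seed=0, maxiter=100, tol=1e-6)
prior1, p = unpack(result.x)
print("CLL at optimum:", -result.fun)
print("Learned P(y=1):", round(prior1, 3))
```

Replacing `neg_cll` with the negative joint log-likelihood would recover generative learning of the same parameters; the paper's contribution is to drive this discriminative objective with (adaptive) DE variants rather than gradient-based or closed-form estimators.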