

Distribution learning

Once the network structure is learned, you can choose how to learn the probability tables by selecting a class from the weka.classifiers.bayes.net.estimate package.

\epsfig{file=images/estimate.algorithms.eps,height=8cm}
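The estimator can also be set programmatically. The following is a minimal sketch, assuming the standard Weka API (BayesNet.setEstimator and SimpleEstimator.setAlpha) and a placeholder ARFF file name:

\begin{verbatim}
import weka.classifiers.bayes.BayesNet;
import weka.classifiers.bayes.net.estimate.SimpleEstimator;
import weka.core.Instances;
import weka.core.converters.ConverterUtils.DataSource;

public class SimpleEstimatorDemo {
  public static void main(String[] args) throws Exception {
    // Load data; the file name is a placeholder for your own ARFF file.
    Instances data = DataSource.read("weather.nominal.arff");
    data.setClassIndex(data.numAttributes() - 1);

    BayesNet net = new BayesNet();
    SimpleEstimator estimator = new SimpleEstimator();
    estimator.setAlpha(0.5);      // prior count N'_{ijk}; 0.5 is the default
    net.setEstimator(estimator);  // plug the chosen estimator into the classifier

    net.buildClassifier(data);
    System.out.println(net);
  }
}
\end{verbatim}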

The SimpleEstimator class produces direct estimates of the conditional probabilities, that is,

\begin{displaymath}P(x_i=k\vert pa(x_i)=j)=\frac{N_{ijk}+N_{ijk}'}{N_{ij}+N_{ij}'}\end{displaymath}

where $N_{ijk}'=\alpha$ is a prior count (the alpha parameter), which can be set and is $0.5$ by default, and $N_{ij}'=\sum_k N_{ijk}'$. With $\alpha=0$, we get maximum likelihood estimates.

\epsfig{file=images/estimate.direct.eps,height=4cm}
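As a small worked example, take a binary node $x_i$ with $\alpha=0.5$ (so $N_{ij}'=2\cdot 0.5=1$), and suppose that in the data $N_{ijk}=3$ and $N_{ij}=10$; the estimate then becomes

\begin{displaymath}P(x_i=k\vert pa(x_i)=j)=\frac{3+0.5}{10+1}\approx 0.32,\end{displaymath}

whereas the maximum likelihood estimate ($\alpha=0$) would be $3/10=0.3$.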

With the BMAEstimator, we get estimates for the conditional probability tables based on Bayes model averaging over all network structures that are substructures of the network structure learned [1]. This is achieved by estimating the conditional probability table of a node $x_i$ given its parents $pa(x_i)$ as a weighted average of all conditional probability tables of $x_i$ given subsets of $pa(x_i)$. The weight assigned to a distribution $P(x_i\vert S)$ with $S\subseteq pa(x_i)$ is proportional to the contribution of the network structure $\forall_{y\in S}\,y\to x_i$ to either the BDe metric or the K2 metric, depending on the setting of the useK2Prior option (false and true, respectively).

\epsfig{file=images/estimate.bma.eps,height=5cm}
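Selecting the BMAEstimator follows the same pattern as the earlier sketch; the snippet below assumes a setUseK2Prior setter corresponding to the useK2Prior option:

\begin{verbatim}
import weka.classifiers.bayes.BayesNet;
import weka.classifiers.bayes.net.estimate.BMAEstimator;
import weka.core.Instances;
import weka.core.converters.ConverterUtils.DataSource;

public class BMAEstimatorDemo {
  public static void main(String[] args) throws Exception {
    // Load data; the file name is a placeholder for your own ARFF file.
    Instances data = DataSource.read("weather.nominal.arff");
    data.setClassIndex(data.numAttributes() - 1);

    BayesNet net = new BayesNet();
    BMAEstimator estimator = new BMAEstimator();
    estimator.setUseK2Prior(false);  // false: weight substructures by the BDe metric
    net.setEstimator(estimator);     // true would use the K2 metric instead

    net.buildClassifier(data);
    System.out.println(net);
  }
}
\end{verbatim}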

