Learning algorithms

The dual nature of a Bayesian network makes it natural to learn one in two stages: first learn a network structure, then learn the probability tables.

There are various approaches to structure learning; in Weka, the following areas are distinguished:

For each of these areas, different search algorithms are implemented in Weka, such as hill climbing, simulated annealing and tabu search.

Once a good network structure is identified, the conditional probability tables for each of the variables can be estimated.

You can select a Bayes net classifier by clicking the classifier 'Choose' button in the Weka Explorer, Experimenter or Knowledge Flow and choosing BayesNet from the weka.classifiers.bayes package (see below).

[Figure images/select.eps: selecting the BayesNet classifier in the classifier chooser]
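Besides the GUI route above, the classifier can also be instantiated programmatically by its fully qualified class name. The sketch below is only illustrative: it uses reflection so that it compiles without weka.jar, and simply reports whether the class was found on the classpath.

```java
public class ChooseBayesNet {
    // Fully qualified name of the classifier, as given in the text.
    public static final String CLASS_NAME = "weka.classifiers.bayes.BayesNet";

    // Returns an instance of the classifier, or null when weka.jar
    // is not on the classpath.
    public static Object instantiate() {
        try {
            return Class.forName(CLASS_NAME).getDeclaredConstructor().newInstance();
        } catch (ReflectiveOperationException e) {
            return null; // weka.jar not available in this environment
        }
    }

    public static void main(String[] args) {
        Object classifier = instantiate();
        System.out.println(classifier == null
                ? CLASS_NAME + " not on classpath"
                : "Instantiated " + classifier.getClass().getName());
    }
}
```

When weka.jar is present, the returned object can be cast to weka.classifiers.Classifier and used like any other Weka classifier.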

The Bayes net classifier has the following options:

[Figure images/bayesnet.options.eps: the Bayes net classifier options dialog]

The BIFFile option can be used to specify a Bayes network stored in a file in BIF format. When the toString() method is called after learning the Bayes network, extra statistics (such as the number of extra and missing arcs) are printed, comparing the learned network with the one in the file.

The searchAlgorithm option can be used to select a structure learning algorithm and specify its options.

The estimator option can be used to select the method for estimating the conditional probability distributions (Section 6).

When the useADTree option is set to true, counts are calculated using the ADTree algorithm of Moore [10]. Since I have not noticed much improvement for small data sets, it is switched off by default. Note that this ADTree algorithm is different from the ADTree classifier algorithm in weka.classifiers.tree.ADTree.
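The options above can also be supplied on the command line or via setOptions(). The sketch below assembles such an option array using only the standard library; the flag names (-B for BIFFile, -Q for searchAlgorithm, -E for estimator, -D to switch the ADTree counts off) are taken from BayesNet's option list, but you should confirm them with `java weka.classifiers.bayes.BayesNet -h` for your Weka version. The file name known.bif is a made-up placeholder.

```java
import java.util.ArrayList;
import java.util.List;

public class BayesNetOptions {
    // Builds a command-line option array for weka.classifiers.bayes.BayesNet.
    // Flag names are assumptions based on the option list in the text.
    public static String[] build(String bifFile, String searchAlgorithm,
                                 String estimator, boolean useADTree) {
        List<String> opts = new ArrayList<>();
        if (bifFile != null) {           // BIFFile: network to compare against
            opts.add("-B");
            opts.add(bifFile);
        }
        opts.add("-Q");                  // searchAlgorithm class name
        opts.add(searchAlgorithm);
        opts.add("-E");                  // estimator class name
        opts.add(estimator);
        if (!useADTree) {                // -D disables the ADTree counting
            opts.add("-D");
        }
        return opts.toArray(new String[0]);
    }

    public static void main(String[] args) {
        String[] opts = build("known.bif",
                "weka.classifiers.bayes.net.search.local.K2",
                "weka.classifiers.bayes.net.estimate.SimpleEstimator",
                false);
        System.out.println(String.join(" ", opts));
    }
}
```

The resulting array is in the form expected by Weka's setOptions(String[]) convention, so the same values work both from the command line and from code.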

The debug option has no effect.


Remco Bouckaert 2008-05-12