Abstract:
Bayesian networks (BNs) play a vital role in knowledge representation and probabilistic inference, and BN structure learning is crucial to research on BN inference. However, most two-stage hybrid methods for learning BN structure suffer from two disadvantages: edges representing weak dependencies are easily lost in the first stage, when the super-structure is learned, and the hill-climbing search used in the second stage easily becomes trapped in a local optimum. To avoid these two disadvantages, the super-structure of the BN is first learned by the Opt01ss algorithm, which misses as few edges as possible. Second, based on the super-structure, three search operators are introduced to analyze the random selection rule for the initial network and to construct a random optimization strategy for it. On this basis, the SSRandom algorithm for BN structure learning is proposed, which is able to escape local optima to a certain extent. Finally, the learning performance of the proposed SSRandom algorithm is verified through experiments on the standard Survey, Asia, and Sachs networks, comparing it with three other hybrid algorithms on four evaluation indices: sensitivity, specificity, Euclidean distance, and the percentage of overall accuracy.
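As a point of reference for the four evaluation indices named above (the abstract does not define them; the formulas below are a sketch assuming the standard confusion-matrix convention, where TP, TN, FP, and FN count edges of the learned structure that are correctly or incorrectly recovered relative to the true network):

$$
\mathrm{Sensitivity} = \frac{TP}{TP + FN}, \qquad
\mathrm{Specificity} = \frac{TN}{TN + FP},
$$
$$
\mathrm{ED} = \sqrt{\bigl(1 - \mathrm{Sensitivity}\bigr)^{2} + \bigl(1 - \mathrm{Specificity}\bigr)^{2}}, \qquad
\mathrm{Accuracy} = \frac{TP + TN}{TP + TN + FP + FN},
$$

where the Euclidean distance $\mathrm{ED}$ measures how far the learned structure lies from the ideal point $(\mathrm{Sensitivity}, \mathrm{Specificity}) = (1, 1)$, so smaller values indicate better recovery.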