ISSN 1000-1239 CN 11-1777/TP

Table of Contents

15 August 2005, Volume 42 Issue 8
A New Object Oriented Probabilistic Graphic Model
Wang Ronggui, Gao Jun, Zhang Yousheng, and Peng Qingsong
2005, 42(8):  1283-1292. 
In this paper, a new object-oriented probabilistic graphical model, named OPM, and its inference algorithm are proposed to solve the problems of knowledge expression and inference in large Bayesian networks. Firstly, the Bayesian network is segmented into several modules organized as classes, and a kind of object model is used to generate the OPM. The OPM makes full use of the conditional independence in the hierarchical structure, which effectively reduces the complexity of model construction and knowledge expression. Secondly, an OPM-based inference algorithm is proposed by generalizing the variable elimination algorithm to realize the inference mechanism of the OPM; the parameters of the algorithm can be adjusted according to the specific problem to control the computational complexity of the inference process. Finally, the OPM is applied to the automatic detection and location of text in images to verify its validity. Experimental results show that the OPM achieves not only good results but also a fast detection speed.
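As a concrete reference point for the elimination-based inference the abstract generalizes, here is a minimal variable-elimination sketch on a toy two-node network (the network, the factor encoding, and all names are illustrative, not the paper's OPM):

```python
from itertools import product

# Minimal variable-elimination sketch on a toy Bayesian network
# (hypothetical example; not the paper's OPM implementation).
# A factor maps assignments (tuples of 0/1 values) to probabilities.

def eliminate(factors, var):
    """Sum out `var`: multiply all factors mentioning it, then marginalize."""
    touching = [f for f in factors if var in f["vars"]]
    rest = [f for f in factors if var not in f["vars"]]
    new_vars = sorted({v for f in touching for v in f["vars"]} - {var})
    table = {}
    for assign in product([0, 1], repeat=len(new_vars)):
        ctx = dict(zip(new_vars, assign))
        total = 0.0
        for val in (0, 1):
            ctx[var] = val
            p = 1.0
            for f in touching:
                key = tuple(ctx[v] for v in f["vars"])
                p *= f["table"][key]
            total += p
        table[assign] = total
    rest.append({"vars": new_vars, "table": table})
    return rest

# Toy network A -> B: P(A=1)=0.3, P(B=1|A=0)=0.2, P(B=1|A=1)=0.9.
fa = {"vars": ["A"], "table": {(0,): 0.7, (1,): 0.3}}
fb = {"vars": ["A", "B"], "table": {(0, 0): 0.8, (0, 1): 0.2,
                                    (1, 0): 0.1, (1, 1): 0.9}}
result = eliminate([fa, fb], "A")[0]
print(result["table"][(1,)])  # P(B=1) = 0.7*0.2 + 0.3*0.9 = 0.41
```

The paper's contribution lies in segmenting the network into class-organized modules before elimination; the sketch only shows the elimination step itself.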
Believability based Iterated Belief Revision
Yang Pei, Gao Yang, and Chen Zhaoqian
2005, 42(8):  1293-1298. 
The theory of belief revision describes how the beliefs of an agent should change upon receiving new information. Classical iterated belief revision methods mainly focus on the consistency of belief change, with little concern for the impact of uncertain information in multi-agent systems or for the revision process itself. In this paper, an approach to believability-based iterated belief revision is presented. This approach relates belief revision in a multi-agent system to the believability of information, which plays an important role in the revision process. Based on the Dempster-Shafer theory of evidence and the believability-function formalism, the believability of information can be obtained, and the maximal consistent subset with the highest believability is chosen to compose the revised belief set. The belief set revised by believability-based iterated belief revision depends on the history of revision, namely, on the information received prior to the current belief set.
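The believability machinery the abstract builds on is Dempster-Shafer evidence combination. A minimal sketch of Dempster's rule over frozenset-valued hypotheses follows (the mass values and hypothesis names are invented for illustration):

```python
# Minimal sketch of Dempster's rule of combination (the evidence-theory
# building block the abstract relies on; the toy masses are illustrative).

def combine(m1, m2):
    """Combine two mass functions (dict: frozenset of hypotheses -> mass)."""
    combined, conflict = {}, 0.0
    for a, ma in m1.items():
        for b, mb in m2.items():
            inter = a & b
            if inter:
                combined[inter] = combined.get(inter, 0.0) + ma * mb
            else:
                conflict += ma * mb  # mass assigned to contradictory pairs
    norm = 1.0 - conflict            # renormalize over non-conflicting mass
    return {s: v / norm for s, v in combined.items()}

A, B = frozenset({"a"}), frozenset({"b"})
AB = A | B  # "a or b": ignorance between the two hypotheses
m1 = {A: 0.6, AB: 0.4}
m2 = {B: 0.5, AB: 0.5}
m = combine(m1, m2)
print(m[A], m[B], m[AB])
```

In the paper's setting, masses of this kind would quantify how believable each information source is before the maximal consistent subset is selected.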
A Sentient Particle Swarm Optimization
Chen Hongzhou, Gu Guochang, and Kang Wangxing
2005, 42(8):  1299-1305. 
A new particle swarm optimization algorithm characterized by sensation is presented to improve the limited capability of dissipative particle swarm optimization in exploiting historical experience. Guided by a sensation model that quantifies the sensation intensity produced by environmental stimulation, individuals behave reasonably and self-adaptively in the activities of self-cognition and sociality. Considering the complexity of a swarm-intelligent system at the level of sensation yields a comprehensive optimization of global search, local search, and mutual cooperation. Tests on three uni- and multi-modal benchmark functions commonly used in evolutionary computation demonstrate the superiority and efficiency of this simple, easily implemented algorithm. Finally, the algorithm's parameters are analyzed and discussed.
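For orientation, here is a plain ("vanilla") particle swarm optimizer on the sphere function; the sensation-based adaptation the paper proposes is an extra term on top of this baseline and is not reproduced here (all parameter values are conventional choices, not the paper's):

```python
import random

# Baseline PSO minimizing the sphere function (illustrative sketch only;
# the paper's sensation-guided variant is not reproduced).

def pso(f, dim=2, n=20, iters=200, w=0.7, c1=1.5, c2=1.5, seed=1):
    rng = random.Random(seed)
    pos = [[rng.uniform(-5, 5) for _ in range(dim)] for _ in range(n)]
    vel = [[0.0] * dim for _ in range(n)]
    pbest = [p[:] for p in pos]            # per-particle best positions
    pbest_val = [f(p) for p in pos]
    g = min(range(n), key=lambda i: pbest_val[i])
    gbest, gbest_val = pbest[g][:], pbest_val[g]
    for _ in range(iters):
        for i in range(n):
            for d in range(dim):
                r1, r2 = rng.random(), rng.random()
                # Inertia + cognitive (self) + social (swarm) terms.
                vel[i][d] = (w * vel[i][d]
                             + c1 * r1 * (pbest[i][d] - pos[i][d])
                             + c2 * r2 * (gbest[d] - pos[i][d]))
                pos[i][d] += vel[i][d]
            val = f(pos[i])
            if val < pbest_val[i]:
                pbest[i], pbest_val[i] = pos[i][:], val
                if val < gbest_val:
                    gbest, gbest_val = pos[i][:], val
    return gbest, gbest_val

sphere = lambda x: sum(v * v for v in x)
best, best_val = pso(sphere)
print(best_val)  # converges close to the optimum 0
```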
A Principal Curve-Based Outlier Detection Model and Its Application in Stock Market
Qi Hongwei, Zhang Junping, and Wang Jue
2005, 42(8):  1306-1312. 
To solve outlier detection problems in which outliers are highly intermixed with normal data, a general variance-based outlier detection model (VODM) is presented, in which the information in the data is decomposed into normal and abnormal components according to their variances. Under the constraint of minimal loss of normal information, outliers are viewed as the top k samples holding maximal abnormal information in a dataset. The VODM is a theoretical framework; the principal curve is then introduced as a concrete algorithm for it. Experiments on abnormal-return detection in the stock market show that the VODM is feasible.
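To make the normal/abnormal decomposition concrete, here is a simplified sketch in which a straight principal component stands in for the principal curve: the "normal" component is the projection onto the first principal direction, the "abnormal" component is the residual, and the top-k residuals are reported as outliers (the data and all names are invented; the paper's curve-based version is nonlinear):

```python
import random

# Variance-based outlier scoring with a linear principal component as a
# stand-in for the principal curve (simplified illustration of VODM).

def top_k_outliers(points, k):
    """Score each 2-D sample by its squared residual from the first PC."""
    n = len(points)
    mx = sum(p[0] for p in points) / n
    my = sum(p[1] for p in points) / n
    centered = [(x - mx, y - my) for x, y in points]
    # Covariance matrix entries.
    sxx = sum(x * x for x, _ in centered) / n
    sxy = sum(x * y for x, y in centered) / n
    syy = sum(y * y for _, y in centered) / n
    # First principal direction via power iteration.
    vx, vy = 1.0, 0.0
    for _ in range(100):
        nx, ny = sxx * vx + sxy * vy, sxy * vx + syy * vy
        norm = (nx * nx + ny * ny) ** 0.5
        vx, vy = nx / norm, ny / norm
    # Abnormal component: squared perpendicular distance to the PC line.
    scores = [(x * vy - y * vx) ** 2 for x, y in centered]
    return sorted(range(n), key=lambda i: scores[i], reverse=True)[:k]

rng = random.Random(0)
data = [(t + rng.gauss(0, 0.05), t + rng.gauss(0, 0.05))
        for t in (rng.gauss(0, 1) for _ in range(50))]
data.append((0.0, 3.0))  # one clear off-line outlier
print(top_k_outliers(data, 1))  # the appended point should rank first
```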
Decision Tree Based Neural Network Design
Li Aijun, Luo Siwei, Huang Hua, and Liu Yunhui
2005, 42(8):  1312-1317. 
The structure determination and parameter initialization of an artificial neural network (ANN) are of great importance to its performance. Generally, a network is built by trial and error with its parameters randomly initialized; such practice results in low training efficiency and network instability. Based on the functional similarity and equivalence between decision trees and ANNs, a new network construction and initialization method, called decision-tree-based neural network (DTBNN), is proposed. The method builds mainly on the entropy net but makes several improvements. Firstly, an initial network structure is determined from the information in a decision tree and then optimized by adding neurons and connections according to practical requirements. Secondly, the network is initialized so that the hyperplanes it represents are much closer to their final versions. By these means, the limitations of the entropy net are avoided. Networks built by this method show good performance; theoretical analysis and experimental results demonstrate the method's effectiveness.
Stability Analysis of Multiobjective Decision Functions Based on GP
Wang Sichun, Zhang Taishan, Yin Zhiyun, and Zhang Chuwen
2005, 42(8):  1318-1323. 
In multiobjective decision problems many values must be assigned, such as the importance of the different criteria and the values of the alternatives with respect to subjective criteria. Since these assignments are approximate, it is very important to analyze the sensitivity of the results when small modifications of the assignments are made. When solving a multicriteria decision problem, it is desirable to choose a decision function that leads to as stable a solution as possible. Proposed here is a method based on genetic programming that produces better decision functions than the commonly used ones. The theoretical expectations are validated by case studies.
A Virtual-Real Comparison Method Used for Sport Simulation and Analysis
Qiu Xianjie, Wang Zhaoqi, Xia Shihong, and Wu Yongdong,
2005, 42(8):  1324-1330. 
The simulation and analysis of human motion is a popular issue in the simulation area and is highly valued in sports motion training and analysis. Video is the main form in which athlete training is recorded. Putting the athlete's motion from video and the standard motion of a 3D virtual athlete from a simulation system onto the same screen is significant, because it makes comparing the differences between them easy and accurate; athletes can then compare their motions with the standard motion of the 3D virtual athlete and improve their competitive level. Since the motion of the 3D virtual athlete can be observed from different viewpoints, its appearance varies, so it is crucial to ensure that the viewpoint of the virtual camera in the simulation system matches that of the video. To solve this problem, a new concept of “virtual-real” comparison is put forward, where “virtual” refers to the simulated motion and “real” to the sports video. The virtual-real comparison technique extracts the camera information from the video and automatically adjusts the viewpoint of the virtual camera in the 3D simulation system. The technique is implemented in a trampoline simulation system, and sound results are obtained. It extends the functionality of the sports simulation system and has greatly benefited athletes in training.
Overview of the Applications of Curvelet Transform in Image Processing
Long Gang, Xiao Lei, and Chen Xuequan
2005, 42(8):  1331-1337. 
The Curvelet transform has received increasing attention in recent years due to its unique characteristics. Developed from the wavelet transform, it overcomes some of the wavelet's inherent limitations in representing the directions of edges in images. The applications of the Curvelet transform reveal its great potential in image processing. In this paper, the theory and implementation of the Curvelet transform are summarized, and its representative applications are introduced in terms of their effects and characteristics compared with other prevailing techniques. Finally, the prospects of its applications in image processing are discussed.
Analysis and Extraction of Structural Features of Off-Line Handwritten Digits Based on Principal Curves
Zhang Hongyun, Miao Duoqian, and Zhang Dongxing
2005, 42(8):  1344-1349. 
Feature extraction and selection are critical to improving the recognition rate of off-line handwritten digits. Principal curves, nonlinear generalizations of principal component analysis, are smooth self-consistent curves that pass through the “middle” of a distribution and thus reflect the structural features of the data well. During digit feature selection, principal curves are first used to extract the structural features of the training data; then the classification features used for coarse and precise classification of digits are chosen by analyzing the structural features of the principal curves in detail; finally, coarse classification and precise classification are carried out separately in handwritten digit recognition. Experiments on the Concordia University CENPARMI handwritten digit database show that these features have good discriminating power for similar digits. The proposed method effectively improves the recognition rate of off-line handwritten digits and provides a new approach to research on off-line handwritten digit recognition.
Progressive Geometry Compression for Interactive Browsing
Liu Bo and Zhang Hongbin
2005, 42(8):  1345-1349. 
Because of self-occlusion, browsing 3D objects is a typical random-access problem. After compression, if only the data of the regions visible under the current viewing parameters is transmitted and decoded, network bandwidth and decoding resources can be saved. Almost all state-of-the-art mesh compression schemes do not take random access into account. A progressive geometry compression scheme based on the wavelet transform, which supports interactive browsing, is proposed. The main idea is to partition the mesh into many blocks, encode the details of each block independently, and then transmit only the encoded details of the visible blocks. The wavelet coefficients representing mesh details are grouped into zerotrees, and each zerotree is compressed independently using a modified SPIHT algorithm. The experimental results indicate that the proposed scheme achieves compression efficiency comparable to PGC, and the amount of data required to reconstruct only the visible regions is about 60% of the complete bitstream.
A Scheduling Algorithm for Long Duration Transaction Based on Strong Orderability Criterion
Wang Jinling, Jin Beihong, and Li Jing
2005, 42(8):  1355-1361. 
Existing concurrency control mechanisms for long-duration transactions need human participation when resolving conflicts or performing rollback, and the cost of rolling back a long-duration transaction is high. In this paper, a new correctness criterion, the strong orderability criterion, is defined for the concurrency control of long-duration transactions, and a scheduling algorithm based on the criterion is proposed. The algorithm makes use of the semantic knowledge of transactions to provide a higher degree of concurrency for long-duration transactions. At the same time, the recovery mechanism of long-duration transactions is greatly simplified and the cost of rollback is decreased. Simulation results show that the algorithm has sound concurrency management ability and recovery efficiency.
An Improved Dimension Hierarchy Aggregate Cube Storage Structure for Data Warehouses
Liang Zuopeng, Hu Kongfa, Dong Yisheng, and Chen Ling
2005, 42(8):  1362-1368. 
In this paper it is proposed to create a high-performance DHAC (dimension hierarchy aggregate cube) using the dimension-hierarchy aggregation technique on the cube. By using the dimension hierarchy of the dimension hierarchy aggregate tree, the DHAC optimizes both query and update efficiency, and cube semantic operations such as roll-up and drill-down are also supported. The DHAC incrementally updates all affected ancestor nodes when data cells are inserted or deleted, and it can also be updated incrementally, without being recreated, when new dimension data is added. As a result, the algorithm greatly reduces update time. The DHAC algorithms are compared with existing ones such as SDDC (space-efficient dynamic data cube) through extensive experiments. The analytical and experimental results show that the proposed DHAC algorithms are more efficient than the existing ones.
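The two operations the abstract emphasizes, rolling up along a dimension hierarchy and incrementally updating only the affected ancestors, can be sketched with plain dictionaries (the hierarchy, members, and numbers below are invented; DHAC's tree structure is not reproduced):

```python
# Toy sketch of dimension-hierarchy roll-up and incremental update
# (illustrative only; not the paper's DHAC data structure).

hierarchy = {  # city -> province: one level of a dimension hierarchy
    "Nanjing": "Jiangsu", "Suzhou": "Jiangsu", "Hangzhou": "Zhejiang",
}
cells = {("Nanjing",): 10, ("Suzhou",): 5, ("Hangzhou",): 7}

def roll_up(cells, mapping):
    """Aggregate leaf cells up one hierarchy level."""
    out = {}
    for (member,), value in cells.items():
        parent = (mapping[member],)
        out[parent] = out.get(parent, 0) + value
    return out

def insert_cell(cells, aggregates, member, value, mapping):
    """Incremental update: touch only the affected ancestor cell,
    instead of recomputing the whole aggregate, as in DHAC's idea."""
    cells[(member,)] = cells.get((member,), 0) + value
    parent = (mapping[member],)
    aggregates[parent] = aggregates.get(parent, 0) + value

print(roll_up(cells, hierarchy))  # province-level totals
```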
A Service-Oriented Workflow Access Control Model
Xu Wei, Wei Jun, and Li Jing
2005, 42(8):  1369-1375. 
With the progress of enterprise globalization and the combination and differentiation of enterprise business, organizations become more dynamic and business processes change frequently; as a result, workflow access control becomes more complicated. To solve this problem, and with a view to decoupling the access control model from the process model, a service-oriented workflow access control (SOWAC) model is presented. In the SOWAC model, a service is the abstraction of a task and the unit for applying access control, so access control on tasks is replaced with access control on services. The elements of the SOWAC model are described and its enforcement is illustrated with an example workflow. Then dynamic separation of duty for the SOWAC model is proposed, based on the authorization history of services. Applying the SOWAC model in a real workflow system shows that it is practical and effective.
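The history-based dynamic separation of duty described above can be sketched as follows: a user who has performed one service of a conflicting pair in a workflow instance may not perform the other in the same instance. All class, role, and service names here are invented for illustration:

```python
# Sketch of service-level access control with dynamic separation of
# duty based on per-instance authorization history (names invented).

class ServiceAccessControl:
    def __init__(self, role_services, conflicting):
        self.role_services = role_services  # role -> set of allowed services
        self.conflicting = conflicting      # set of frozenset service pairs
        self.history = {}                   # instance -> [(user, service)]

    def authorize(self, instance, user, role, service):
        """Check the request and, if granted, record it in the history."""
        if service not in self.role_services.get(role, set()):
            return False
        for done_user, done_service in self.history.get(instance, []):
            if (done_user == user and
                    frozenset({done_service, service}) in self.conflicting):
                return False  # dynamic separation of duty violated
        self.history.setdefault(instance, []).append((user, service))
        return True

ac = ServiceAccessControl(
    role_services={"clerk": {"prepare", "approve"}},
    conflicting={frozenset({"prepare", "approve"})},
)
print(ac.authorize("wf1", "alice", "clerk", "prepare"))  # True
print(ac.authorize("wf1", "alice", "clerk", "approve"))  # False: same user
print(ac.authorize("wf1", "bob", "clerk", "approve"))    # True
```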
A Formal Framework for Agent-Oriented Analysis and Design Based on Grid
Liu Wei and Liu Zongtian
2005, 42(8):  1376-1383. 
An agent-oriented formal modeling framework based on OGSA is proposed, called formal AOMG (formal agent-oriented modeling based on grid). Based on Object-Z, formal AOMG integrates meta-models from i* and UML, and introduces a new intention attribute, the service attribute, to agents. Formal AOMG can formalize the relationships between agents and the grid environment. Three core models of formal AOMG (the organization model, the agent class model, and the agent service model) are presented. Formal AOMG provides a set of formal semantic mapping rules between the organization model and the agent class model, which facilitates the system model's transition from the agent level to the object level.
Dynamic Caching Techniques of Media Suffix in Streaming Content Delivery
Cai Qingsong, Li Zimu, Qin Shaohua, and Hu Jianping
2005, 42(8):  1384-1390. 
Built on the recently presented optimized batch patching (OBP), two dynamic caching strategies for media suffixes, named IC-BP and PA-BP, are proposed in this paper to alleviate the over-consumption of backbone bandwidth and server load in media streaming systems. The required average backbone bandwidth, the average number of server channels used, and the maximum cache capacity of each scheme when serving a single media object are derived. By defining a simple but practical cost function, the aggregate delivery costs of the two schemes, which trade off network and server resources, are discussed. The results show that both schemes can greatly reduce backbone bandwidth consumption and server load, while PA-BP outperforms IC-BP with a much lower cost, since it uses an additional buffer to pre-buffer incoming data for requests arriving in the current batch interval and thus saves more patch traffic and transmission cost.
Decreasing Failures Correlation in Peer-to-Peer Systems
Zhang Dawei, Han Hua, and Dai Yafei
2005, 42(8):  1391-1396. 
m-of-n fault tolerance techniques are used to increase availability and reliability in many P2P network storage systems, but in practice correlated failures among servers lead to a low tolerance rate for these techniques. Presented in this paper is an approach to discovering sets of server nodes that fail with low correlation. The effectiveness of m-of-n fault tolerance techniques can be increased by using these independently failing server nodes. The approach is discussed in detail and a preliminary evaluation is provided, which shows that the approach is practical and effective.
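One common way to operationalize "fail with low correlation" is to estimate pairwise correlation from up/down traces and greedily grow a set whose members stay below a threshold. The sketch below does exactly that with invented traces and an invented threshold; it is not the paper's algorithm:

```python
# Sketch: estimate pairwise failure correlation from binary up/down
# traces and greedily select nodes that fail (nearly) independently.
# Traces and the threshold are made up for illustration.

def correlation(a, b):
    """Phi coefficient between two binary failure traces."""
    n = len(a)
    pa = sum(a) / n
    pb = sum(b) / n
    pab = sum(x & y for x, y in zip(a, b)) / n
    denom = (pa * (1 - pa) * pb * (1 - pb)) ** 0.5
    return 0.0 if denom == 0 else (pab - pa * pb) / denom

def low_correlation_set(traces, threshold=0.3):
    chosen = []
    for node, trace in traces.items():
        if all(correlation(trace, traces[c]) <= threshold for c in chosen):
            chosen.append(node)
    return chosen

traces = {  # 1 = node failed in that measurement interval
    "n1": [1, 1, 0, 0, 1, 0, 0, 0],
    "n2": [1, 1, 0, 0, 1, 0, 0, 0],   # always fails together with n1
    "n3": [0, 0, 1, 0, 0, 0, 1, 0],   # fails at unrelated times
}
print(low_correlation_set(traces))  # n2 is rejected as correlated with n1
```

Placing the m fragments of an m-of-n erasure-coded object on such a set is what raises the effective tolerance rate.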
A Task Allocation and Performance Analysis Based on Finished Time
Qu Shaogang, Yang Guangwen, Lin Chuang, and Shi Shuming
2005, 42(8):  1397-1402. 
Seemingly overnight, the Internet has gone from an academic experiment to a worldwide information matrix. How to improve the response time and throughput of a complicated network system is a challenging problem. This paper discusses and provides request dispatching and selection schemes for Web server clusters, together with stochastic high-level Petri net models of those schemes. A novel method is proposed to allocate Web request tasks according to the minimum finish time, and an analysis based on Petri net experiments is also presented. Experimental results show that the method is feasible and effective.
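The minimum-finish-time allocation idea can be sketched as a greedy dispatcher that sends each request to the server predicted to finish it earliest (a common baseline consistent with the abstract; the Petri-net modeling itself is not reproduced, and the job sizes are invented):

```python
import heapq

# Greedy dispatch by earliest predicted finish time (illustrative
# baseline; service times are hypothetical).

def dispatch(jobs, n_servers):
    """jobs: list of service times. Returns (assignment, makespan)."""
    heap = [(0.0, s) for s in range(n_servers)]  # (finish time, server id)
    heapq.heapify(heap)
    assignment = []
    for job in jobs:
        finish, server = heapq.heappop(heap)  # least-loaded server
        finish += job
        assignment.append(server)
        heapq.heappush(heap, (finish, server))
    makespan = max(f for f, _ in heap)
    return assignment, makespan

assignment, makespan = dispatch([4, 3, 2, 2, 1], n_servers=2)
print(assignment, makespan)  # loads balance to 6 time units per server
```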
A QoS-Aware Multicast Routing Protocol Based on Centralized and Distributed Algorithms
Huang Dongjun, Chen Songqiao, and Wang Jianxin
2005, 42(8):  1403-1408. 
QoS-aware multicast service is an essential component of the future high-speed Internet, and QoS-aware multicast routing algorithms and protocols are the core of a QoS-based multicast architecture. Proposed in this paper is a multi-path, QoS-aware multicast routing protocol that integrates a centralized algorithm with a distributed one. A node initially obtains, from a special node called the manager, information about the existing tree the receiver wants to join. It then uses Dijkstra's minimum-cost algorithm, with appropriate costs as parameters, to calculate several minimum-cost paths from the joining receiver to the source node and the on-tree nodes of the group, and regards them as a set of feasible paths for selection. When a joining receiver gets the set, it employs a heuristic to select the best path for joining the multicast tree. Analysis and experiments show that the algorithm can solve the delay- and bandwidth-constrained minimum-cost multicast tree problem, with advantages such as loop freedom, a high average call acceptance ratio, shorter average call setup time, and better scalability.
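The core building block the abstract describes, a minimum-cost path search that honors a QoS constraint, can be sketched as Dijkstra's algorithm with bandwidth-based link pruning (the graph, node names, and numbers are invented; the multi-path set construction and tree joining heuristic are not reproduced):

```python
import heapq

# Dijkstra with QoS pruning: links below the required bandwidth are
# skipped before cost minimization (illustrative graph and numbers).

def min_cost_path(graph, src, dst, need_bw):
    """graph[u] = list of (v, cost, bandwidth). Returns (path, cost)."""
    dist = {src: 0.0}
    prev = {}
    heap = [(0.0, src)]
    while heap:
        d, u = heapq.heappop(heap)
        if u == dst:
            break
        if d > dist.get(u, float("inf")):
            continue  # stale heap entry
        for v, cost, bw in graph.get(u, []):
            if bw < need_bw:
                continue  # QoS pruning: link cannot carry the stream
            nd = d + cost
            if nd < dist.get(v, float("inf")):
                dist[v], prev[v] = nd, u
                heapq.heappush(heap, (nd, v))
    if dst not in dist:
        return None, float("inf")
    path, node = [dst], dst
    while node != src:
        node = prev[node]
        path.append(node)
    return path[::-1], dist[dst]

graph = {  # (neighbor, cost, bandwidth)
    "r": [("a", 1, 10), ("b", 1, 2)],
    "a": [("s", 5, 10)],
    "b": [("s", 1, 2)],
}
path, cost = min_cost_path(graph, "r", "s", need_bw=5)
print(path, cost)  # the cheap r-b-s route is pruned for lack of bandwidth
```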
RSDictionary—A Global Namespace for Distributed Computing Environment
Zhang Wusheng, Yang Guangwen, Shen Meiming, and Zheng Weimin
2005, 42(8):  1409-1414. 
Resource management plays a very important role in grid and P2P computing environments. A global namespace, RSDictionary, is presented to organize distributed resources by name rather than by hosting information such as IP address or server name. RSDictionary defines a logical view and a physical view to manage the resources of such distributed environments and satisfy efficiency requirements. The physical view organizes the resources into a static B-tree-like structure that mimics the organization of a dictionary, while the logical view is constructed at runtime according to the needs of a specific task. With the RSDictionary global namespace, computational resources can be deployed and referred to solely by name, and a local synchronization model is also introduced to minimize maintenance costs.
An Entropy-Based Method to Measure the Regularity of Normal Behaviors in Anomaly Detection
Pan Feng, Jiang Junjie, and Wang Weinong
2005, 42(8):  1415-1421. 
Anomaly detection is an essential component of protection mechanisms against novel attacks. In this paper, an entropy-based method to measure the regularity of normal behaviors in anomaly detection is proposed. The measure is defined as the ratio of the entropy of normal behavior to the entropy of totally random behavior. Two case studies, on Unix system call data and network tcpdump data, illustrate the utility of this measure. A new algorithm is advanced to detect network intrusions using sequences of system calls, and it can perform anomaly detection over noisy data. At the same time, a new immune algorithm, the multi-level negative selection algorithm, is developed and applied to anomaly detection; compared with Forrest's negative selection algorithm, it essentially enhances detector generation efficiency.
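The shape of the regularity measure can be sketched directly: compute the entropy of short observed sequences and divide by the entropy of fully random behavior over the same alphabet. The window length, trace, and alphabet below are illustrative choices, not the paper's parameters:

```python
import math
from collections import Counter

# Sketch of an entropy-ratio regularity measure over short n-grams of
# a behavior trace (toy trace; not the paper's datasets or parameters).

def sequence_entropy(trace, n=2):
    """Empirical entropy (bits) of the n-gram distribution of a trace."""
    grams = [tuple(trace[i:i + n]) for i in range(len(trace) - n + 1)]
    counts = Counter(grams)
    total = sum(counts.values())
    return -sum((c / total) * math.log2(c / total) for c in counts.values())

def regularity(trace, alphabet_size, n=2):
    """Near 0 = highly regular behavior, near 1 = indistinguishable
    from uniformly random behavior over the same alphabet."""
    max_entropy = n * math.log2(alphabet_size)  # totally random behavior
    return sequence_entropy(trace, n) / max_entropy

regular = ["open", "read", "close"] * 20  # a rigidly repeating trace
print(regularity(regular, alphabet_size=3))  # about 0.5; random traces near 1
```

Only 3 of the 9 possible bigrams ever occur in the repeating trace, which is what pulls the ratio well below 1.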
Research and Protection of the Digital Evidence Collecting System
Sun Bo and Sun Yufang
2005, 42(8):  1422-1426. 
Research regarding digital forensic technologies has become more active with the recent increase in illegal accesses to computer systems. Digital evidence is easy to modify and erase, so in order to collect evidence with integrity and fidelity, a digital evidence collecting system, installed in the target system in advance, is proposed to collect digital evidence purposefully. The security of the forensic mechanisms themselves is another serious problem; based on an analysis of related research, a secure area is proposed to protect the forensic mechanisms from attack.
A New Chameleon Threshold Signature Based on Bilinear Pairing
Ma Chunbo and He Dake
2005, 42(8):  1427-1430. 
Consider the following situation: two companies, A and B, want to communicate with each other over the Internet. In order to generate a secure signature on behalf of company A, A requires that only a sufficient number of its members acting in cooperation can generate a valid signature. At the same time, company A wants only the designated receiver to be able to verify the signature. Motivated by these considerations and by the necessity of protecting the security of the original information, this paper combines secret sharing technology with the Chameleon function and presents a new signature scheme based on bilinear pairing. Only the designated receiver can verify the signature, and the receiver cannot convince a third party of the signature's validity. The signature is undeniable: when a dispute occurs, the signer can prove the validity of the signature.
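The threshold ingredient, "only enough members in cooperation can sign", classically rests on Shamir secret sharing over a prime field. A toy sketch follows, with a deliberately small prime; the paper's actual scheme additionally uses bilinear pairings and a chameleon hash, which are not shown here:

```python
import random

# Toy (t, n) Shamir secret sharing over GF(P): any t of n shares
# reconstruct the secret signing value (illustrative parameters only;
# real schemes use cryptographically large primes).

P = 2**13 - 1  # 8191, a small prime for illustration

def make_shares(secret, t, n, rng):
    """Evaluate a random degree-(t-1) polynomial with f(0) = secret."""
    coeffs = [secret] + [rng.randrange(P) for _ in range(t - 1)]
    def poly(x):
        return sum(c * pow(x, i, P) for i, c in enumerate(coeffs)) % P
    return [(x, poly(x)) for x in range(1, n + 1)]

def reconstruct(shares):
    """Lagrange interpolation at x = 0 recovers the secret."""
    secret = 0
    for i, (xi, yi) in enumerate(shares):
        num = den = 1
        for j, (xj, _) in enumerate(shares):
            if i != j:
                num = num * (-xj) % P
                den = den * (xi - xj) % P
        # pow(den, P - 2, P) is the modular inverse since P is prime.
        secret = (secret + yi * num * pow(den, P - 2, P)) % P
    return secret

rng = random.Random(7)
shares = make_shares(1234, t=3, n=5, rng=rng)
print(reconstruct(shares[:3]))  # any 3 of the 5 shares recover 1234
```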
DNA Computation for a Category of Special Integer Planning Problem
Wang Lei, Lin Yaping, and Li Zhiyong
2005, 42(8):  1431-1437. 
DNA computation, based on the theory of biochemical reactions, performs better on a class of intractable computational problems, especially NP-complete problems, than traditional computing methods based on current silicon computers, so it is of great importance to study DNA computation. New concepts, such as the rank of a constraint equation group and three kinds of constraint complement links of a constraint equation group, are proposed. Based on these concepts and the fluorescence-labeling method in the surface-based approach to DNA computation, a novel DNA computation algorithm is designed that finds the optimal solutions of a category of special integer programming problems. By using the fluorescence-quenching technique to eliminate false solutions from all possible solutions of the given problem, the new algorithm identifies all feasible solutions, and then obtains all optimal solutions by comparing the objective-function values of those feasible solutions. Analysis shows that the new algorithm has good characteristics such as simple encoding, low cost, and short operating time.
An Algorithm of B+ Tree Management in P2P Environment
Ju Dapeng, Li Ming, Hu Jinfeng, Wang Dongsheng, Zheng Weimin, and Ma Yongquan
2005, 42(8):  1438-1444. 
Peer-to-peer storage architectures have boomed in recent years for their immense capacity, self-organization, high scalability, fault tolerance, and so on. Distributed query is an indispensable part of them, comprising queries on keywords and queries on numeric ranges. There are many efficient algorithms for keyword queries in peer-to-peer storage architectures, but few for numeric range queries. In this paper, the PB-link tree, an efficient indexing algorithm for numeric range queries in peer-to-peer storage architectures, is presented. The PB-link tree offers high availability, low networking overhead, and load balance. The experimental results show that the PB-link tree's networking overhead is 80% lower, and its query efficiency 6 times higher, than those of traditional distributed indexing algorithms, and that it can guarantee the correctness of 85% of queries when half of the peers fail.
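The flavor of the problem, answering a numeric range query over keys partitioned across peers, can be sketched with range-partitioned peers and peer-range pruning (the partitioning, key values, and class names are invented; the PB-link tree's distributed structure and failure handling are not reproduced):

```python
import bisect

# Sketch of a numeric range query over range-partitioned peers
# (illustrative only; not the PB-link tree algorithm itself).

class Peer:
    def __init__(self, keys):
        self.keys = sorted(keys)

    def range_query(self, lo, hi):
        """Return this peer's keys in [lo, hi] via binary search."""
        i = bisect.bisect_left(self.keys, lo)
        j = bisect.bisect_right(self.keys, hi)
        return self.keys[i:j]

# Each peer owns a contiguous key range of width 100: [0,100), [100,200), ...
WIDTH = 100
peers = [(start, Peer(range(start, start + WIDTH, 7)))
         for start in (0, 100, 200)]

def distributed_range(peers, lo, hi):
    result = []
    for start, peer in peers:
        if start + WIDTH <= lo or start > hi:
            continue  # prune peers whose range cannot overlap the query
        result.extend(peer.range_query(lo, hi))
    return result

print(distributed_range(peers, 95, 110))  # spans two peers' partitions
```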
Analysis and Research of a Window-Constrained Real-Time System with Constraints
Zhu Xiangbin and Tu Shiliang
2005, 42(8):  1445-1451. 
Based on research into window-constrained real-time systems with (m,k) constraints, such a window-constrained real-time system is proposed. Firstly, the system is analyzed and some conclusions about its schedulability are obtained. Then a new dynamic window-constrained scheduling algorithm, which exploits the characteristics of these constraints to improve the schedulability of the system, is proposed. To evaluate the performance of the new algorithm, extensive simulation studies are carried out; the simulations apply the dynamic window-constrained scheduling (DWCS) algorithm to the same system and use it as a baseline for comparison with the proposed algorithm. Simulation results show that the new algorithm outperforms the original DWCS algorithm. Finally, future research directions are discussed.
Track Replica—The Strategy for Disk Seeking Optimization in Retrieving Multimedia Data
Liu Jun, Yang Xuejun, Tang Yuhua, and Wang Junwei
2005, 42(8):  1452-1459. 
Because of mechanical operations such as head seeking and rotation, disk-based storage is a main bottleneck of a multimedia server. With track-aligned request service, disk rotation time can be fully utilized, so head seeking becomes the main source of waste in disk I/O. Since multimedia data is almost read-only and the density of magnetic media increases at a rate of about 60% per year, a new strategy that trades space for reduced seeking is provided in this paper: cylinder replication. The main work of the paper is as follows: first, the placement of n-way cylinder replicas on a single disk and n-d-way cylinder replicas on a disk array is presented; then expressions for the mathematical expectation of the mean seek distance under the two placement strategies are derived; finally, extensive simulations are carried out. The simulation results not only confirm that the expressions are correct, but also show that the cylinder replica strategy works better than traditional tradeoffs such as n-way striping and D-way disk mirroring.
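The intuition behind the derived expectations can be checked with a tiny Monte-Carlo model: place n copies of each logical track evenly spaced across the cylinders and always seek to the nearest copy. The disk geometry, request model, and counts below are invented simplifications (no rotational effects, uniform requests), not the paper's simulation setup:

```python
import random

# Monte-Carlo sketch: n evenly spaced copies of each track shrink the
# mean seek distance, since the head moves to the nearest copy
# (toy model; uniform requests, no rotational delay).

def mean_seek(n_copies, cylinders=1000, requests=20000, seed=3):
    rng = random.Random(seed)
    head = 0
    total = 0
    for _ in range(requests):
        target = rng.randrange(cylinders)
        # Copies of a logical track sit cylinders/n_copies apart.
        copies = [(target + k * cylinders // n_copies) % cylinders
                  for k in range(n_copies)]
        total += min(abs(c - head) for c in copies)
        head = min(copies, key=lambda c: abs(c - head))
    return total / requests

no_rep = mean_seek(1)    # classic result: about cylinders/3
two_way = mean_seek(2)
print(no_rep, two_way)   # replication markedly shortens the mean seek
```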