ISSN 1000-1239 CN 11-1777/TP

Table of Contents

15 July 2006, Volume 43 Issue 7
Paper
An Ascending Bid Multi-Attribute Auction Method
Jin Xing and Shi Chunyi
2006, 43(7):  1135-1141. 
Using auction methods to allocate resources among self-interested agents efficiently and reasonably is one of the challenges of multi-agent systems. Multi-attribute auctions extend traditional auction settings to allow negotiation over non-price attributes, such as weight, color, and size, in addition to price. Based on a generalized multi-attribute auction model, an auction method called MAE is provided; MAE extends the English auction from a single attribute to multiple attributes. The strategies and profits of the buyer and the sellers in MAE are analyzed, and some main properties of MAE are proved: buyers and sellers are individually rational in MAE, buyers and sellers have nearly optimal strategies, and the total profit of buyers and sellers is nearly optimal under the given strategies. Compared with Esther David's work, MAE has a more generalized model. Compared with the MAV auction, MAE is more transparent for sellers, and the seller's strategy in MAE is more intuitive than in MAV.
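As a rough illustration of the ascending multi-attribute idea (not the formal MAE rules, which the paper defines precisely), the sketch below scores each bid over price and non-price attributes with a buyer-announced weighting and requires every new bid to beat the standing score by a minimum increment; the scoring rule, weights, and increment are assumptions for illustration only.

```python
# Hypothetical sketch of one ascending multi-attribute auction round (not the formal MAE rules).
# Bids carry a price and non-price attributes; the buyer scores them with public weights.

def score(bid, weights):
    # Buyer's announced quasi-linear scoring rule: attribute utility minus price.
    return sum(weights[a] * v for a, v in bid["attrs"].items()) - bid["price"]

def ascending_auction(seller_bids, weights, increment=1.0):
    """Accept bids in order; each new bid must beat the standing score by the increment."""
    best, winner = float("-inf"), None
    for name, bid in seller_bids:          # bids arrive in ascending rounds
        s = score(bid, weights)
        if s >= best + increment:
            best, winner = s, name
    return winner, best

weights = {"quality": 2.0, "delivery": 1.0}
bids = [("s1", {"price": 10, "attrs": {"quality": 8, "delivery": 3}}),
        ("s2", {"price": 12, "attrs": {"quality": 9, "delivery": 5}})]
print(ascending_auction(bids, weights))    # ('s2', 11.0)
```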
A Winner Determination Algorithm for Combinatorial Auctions with Decreasing Marginal Utilities
Jin Xing and Shi Chunyi
2006, 43(7):  1142-1148. 
Auctions are important mechanisms for resource and task allocation in multi-agent systems (MAS). Auction methods have explicit rules, require less of the agents' abilities, and achieve fast and efficient mutual agreements. In most microeconomic theories, consumers are assumed to exhibit decreasing marginal utilities. Aiming at the combinatorial auction problem (CAP) with decreasing marginal utilities, the 1-UNT (one unit cannot transfer) condition and the k-UNT (k units cannot transfer) condition on allocations are presented. An iterative algorithm based on the 1-UNT check is developed, and its result is proved to reach an effectiveness ratio of at least 0.5. Experimental study shows that the combination of the 1-UNT algorithm and a greedy algorithm can achieve better allocations in less time. A k-UNT-check-based algorithm is also presented, and it is proved that even in the simple case of a CAP with two buyers, the k-UNT-check-based algorithm cannot guarantee an effectiveness ratio higher than 0.5. The 1-UNT-check-based algorithm partially improves the work of Benny Lehmann.
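The following sketch illustrates the flavor of a 1-UNT ("one unit cannot transfer") improvement check on a multi-unit allocation under decreasing marginal utilities; it is a simplified stand-in, not the paper's algorithm, and the marginal-value table is invented for the example.

```python
# Illustrative 1-UNT-style local improvement: move single units between buyers while that
# raises total value. A simplified stand-in for the paper's algorithm.

def total_value(alloc, marginal):
    # marginal[i][k] is buyer i's marginal value for the (k+1)-th unit (non-increasing in k).
    return sum(sum(marginal[i][:alloc[i]]) for i in alloc)

def one_unt_improve(alloc, marginal):
    improved = True
    while improved:
        improved = False
        for i in alloc:
            for j in alloc:
                if i == j or alloc[i] == 0 or alloc[j] >= len(marginal[j]):
                    continue
                gain = marginal[j][alloc[j]] - marginal[i][alloc[i] - 1]
                if gain > 0:                 # transferring one unit from i to j helps
                    alloc[i] -= 1
                    alloc[j] += 1
                    improved = True
    return alloc

marginal = {0: [9, 5, 2], 1: [8, 7, 6]}      # decreasing marginal utilities
alloc = {0: 3, 1: 0}                         # a greedy initial allocation of three units
print(one_unt_improve(alloc, marginal), total_value(alloc, marginal))   # {0: 1, 1: 2} 24
```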
Persuasive Multi-Agent Multi-Issue Negotiation
Yang Pei, Gao Yang, and Chen Zhaoqian
2006, 43(7):  1149-1154. 
Negotiation in MAS usually involves multiple issues, which leads to an extremely large problem space. Classical negotiation methods based on game theory obtain the optimal resolution through an exhaustive search of this space and are consequently not suitable for multi-issue negotiation. In addition, those methods give little consideration to changes in the agents' preferences, so with incomplete or incorrect information the optimal resolution is not rational. A persuasive multi-agent multi-issue negotiation method is illustrated, which is underpinned by belief revision logic. These belief-based negotiation agents are able to persuade their opponents to change their positions. Preliminary experiments show that the belief-based negotiation agents can adapt to a changing environment and reach agreement more quickly and correctly than those based on the classical negotiation model under time pressure.
A Multi-Agent Negotiation Model Based on Acquaintance Coalition and Extended Contract Net Protocol
Tao Haijun, Wang Yadong, Guo Maozu, and Wang Hanlun
2006, 43(7):  1155-1160. 
The contract net protocol is widely used in multi-agent systems because of its high extensibility and flexibility in dynamic environments. However, many shortcomings appear when the traditional contract net protocol is applied in multi-agent systems, such as negotiation cost that grows rapidly with the system's size, no guarantee on the quality of negotiation, and difficulty in adapting to dynamic environments. A multi-agent negotiation model based on an acquaintance coalition and an extended contract net protocol is presented to improve the efficiency of negotiation. The structure of the multi-agent negotiation model is given to support the extended contract net protocol. The acquaintance coalition policy, the trust degree parameter, and the adjustment rule are introduced and used to construct the extended contract net protocol. Finally, through an example and analysis of a missile defense system that uses the model, the improvement in negotiation efficiency is demonstrated.
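A minimal sketch of the trust-degree idea, under assumed selection and adjustment rules rather than the ones defined in the paper: the manager awards a task to the acquaintance with the best trust-to-cost ratio and nudges its trust degree after the task completes.

```python
# Hypothetical trust-weighted contractor selection in an extended contract net;
# the award rule and trust adjustment here are illustrative, not the paper's formulas.

class Manager:
    def __init__(self, acquaintances):
        self.trust = {a: 0.5 for a in acquaintances}    # initial trust degrees

    def award(self, bids):
        # bids: {agent: quoted cost}; prefer low cost weighted by trust degree.
        return max(bids, key=lambda a: self.trust[a] / bids[a])

    def feedback(self, agent, success, rate=0.1):
        # Simple additive trust adjustment after task completion.
        target = 1.0 if success else 0.0
        self.trust[agent] += rate * (target - self.trust[agent])

m = Manager(["a1", "a2", "a3"])
winner = m.award({"a1": 4.0, "a2": 3.0, "a3": 5.0})
m.feedback(winner, success=True)
print(winner, m.trust[winner])    # a2 0.55
```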
Study of a Neural Network Ensemble Algorithm for Small Data Sets
Li Kai and Huang Houkuan
2006, 43(7):  1161-1166. 
Ensemble learning has become a hot topic in machine learning, as it dramatically improves the generalization performance of a classifier. In this paper, neural network ensembles for small data sets are studied and an approach to neural network ensembling (NovelNNE) is presented. To increase ensemble diversity, a diverse data set is generated as part of the training set in order to create diverse neural network classifiers. Moreover, different combination methods are studied for NovelNNE. Experimental results show that NovelNNE achieves higher predictive accuracy with both the relative majority vote method and the Bayesian combination method.
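A minimal sketch of the general recipe, not NovelNNE itself: several small networks are trained on resampled subsets to increase diversity and combined by relative majority vote. The use of scikit-learn's MLPClassifier and the subset-generation scheme are assumptions for illustration.

```python
# Generic neural-network ensemble with relative majority voting (illustrative, not NovelNNE).
import numpy as np
from sklearn.datasets import make_classification
from sklearn.neural_network import MLPClassifier
from sklearn.utils import resample

def train_ensemble(X, y, n_members=5, seed=0):
    members = []
    for i in range(n_members):
        Xi, yi = resample(X, y, random_state=seed + i)   # diverse training subsets
        clf = MLPClassifier(hidden_layer_sizes=(8,), max_iter=2000, random_state=i)
        members.append(clf.fit(Xi, yi))
    return members

def majority_vote(members, X):
    votes = np.array([m.predict(X) for m in members])
    # Relative majority: the most frequent integer label among member predictions.
    return np.array([np.bincount(col).argmax() for col in votes.T])

X, y = make_classification(n_samples=60, n_features=8, random_state=0)   # a small data set
members = train_ensemble(X, y)
print((majority_vote(members, X) == y).mean())
```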
A Nonlinear Time-Varying System Simulation Modeling Method Based on Neural Networks
Wang Xuefei
2006, 43(7):  1167-1172. 
Aiming at the simulation of systems whose inputs and outputs are all continuous time-varying functions, a new neural network modeling method based on function expansion is proposed. A group of suitable basis functions is selected in the continuous function space, and the input and output functions are each represented as expansions over a finite number of basis functions within a specified precision. By learning from training samples, a neural network establishes the mapping between the expansion coefficients of the input functions and those of the output functions. Because there is a one-to-one correspondence between a function and its expansion coefficients, the mapping between the input and output of the continuous system can thus be realized. An implementation based on the Walsh transform is given, and the effectiveness of the method is demonstrated by simulating the tertiary oil recovery process in oilfield development.
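The core idea can be sketched as follows, with a cosine basis standing in for the Walsh basis and an off-the-shelf regressor standing in for the paper's network; the toy "system" and all parameters are assumptions for illustration.

```python
# Function-expansion modeling sketch: represent sampled input/output signals by a few basis
# coefficients and learn the coefficient-to-coefficient mapping with a small network.
import numpy as np
from sklearn.neural_network import MLPRegressor

t = np.linspace(0, 1, 64, endpoint=False)
basis = np.array([np.cos(2 * np.pi * k * t) for k in range(8)])   # stand-in for a Walsh basis

def expand(f_samples):
    # Project a sampled function onto the basis to get its expansion coefficients.
    return basis @ f_samples / len(t)

# Toy training pairs: input functions u_i(t) and responses y_i(t) of a stand-in "system".
U = [np.sin(2 * np.pi * (1 + i) * t) for i in range(20)]
Y = [np.cumsum(u) / len(t) for u in U]
Xc = np.array([expand(u) for u in U])
Yc = np.array([expand(y) for y in Y])

net = MLPRegressor(hidden_layer_sizes=(16,), max_iter=5000, random_state=0).fit(Xc, Yc)
print(net.predict(Xc[:1]).shape)   # predicted output-function coefficients, shape (1, 8)
```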
Manifold Learning Algorithms Based on Spectral Graph Theory
Luo Siwei and Zhao Lianwei
2006, 43(7):  1173-1179. 
In manifold learning, one seeks to find a smooth low-dimensional manifold embedded in a high-dimensional vector space, based on a set of sample points. Spectral graph theory studies the eigenvectors and eigenvalues of matrices associated with graphs and has recently been widely used in manifold learning algorithms. In this paper, the relationship between manifolds and manifold learning is introduced first, then some typical manifold learning algorithms based on spectral graph theory are studied, and finally some directions for further research are suggested.
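As a concrete instance of the spectral-graph approach, a compact Laplacian Eigenmaps sketch (unnormalized-Laplacian variant): build a neighborhood graph over the samples, form the graph Laplacian, and embed with its smallest nontrivial eigenvectors.

```python
# Laplacian Eigenmaps, unnormalized-Laplacian variant, as a minimal spectral manifold learner.
import numpy as np

def laplacian_eigenmaps(X, n_neighbors=8, dim=2):
    n = len(X)
    d2 = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)     # pairwise squared distances
    W = np.zeros((n, n))
    for i in range(n):
        nbrs = np.argsort(d2[i])[1:n_neighbors + 1]          # skip the point itself
        W[i, nbrs] = 1.0
    W = np.maximum(W, W.T)                                   # symmetrize the adjacency graph
    L = np.diag(W.sum(1)) - W                                # unnormalized graph Laplacian
    vals, vecs = np.linalg.eigh(L)
    return vecs[:, 1:dim + 1]                                # drop the constant eigenvector

X = np.random.RandomState(0).rand(200, 3)
print(laplacian_eigenmaps(X).shape)    # (200, 2)
```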
A Reverse Triple I Algorithm for Fuzzy Reasoning Based on Maximum Fuzzy Entropy Principle
Hou Jian, Peng Jiayin, Zhang Yuzhuo, and Zhang Chengyi
2006, 43(7):  1180-1185. 
The fuzzy degree of the conclusion of reverse triple I fuzzy inference is measured by fuzzy entropy, and a maximum fuzzy entropy reverse triple I principle is introduced. Existence conditions of the solutions of the fuzzy entropy reverse triple I algorithms for the FMP and FMT problems are discussed, and computing formulas of the algorithm based on some common implication operators are provided.
Iris Recognition Based on Wavelet Transform with Shift Invariance Preprocessing
Ming Xing, Liu Yuanning, Zhu Xiaodong, and Xu Tao
2006, 43(7):  1186-1193. 
The ordinary discrete wavelet transform is sensitive to the initial position of a signal and cannot be used to extract stable iris features. To weaken the negative effect of the shift sensitivity of the wavelet transform on recognition results, a shift-invariance preprocessing method based on directional energy distribution sequences is proposed to correct the angular rotation of different iris textures, which corresponds to the horizontal shift of normalized iris blocks. A threshold scheme is then applied to transform all the wavelet coefficients into a binary vector representing the iris features. In verification mode, a weighted Hamming distance classifier is finally adopted to perform multi-template matching to identify the unknown feature vector. Comparison experiments show that the proposed algorithm enhances the usefulness of wavelet-based iris feature representation and leads to good iris recognition performance.
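A small sketch of the matching stage only (binary features compared against several enrolled templates with a weighted Hamming distance); the feature length, weights, and acceptance threshold below are illustrative assumptions, not the paper's values.

```python
# Weighted Hamming distance matching of binarized wavelet features against enrolled templates.
import numpy as np

def weighted_hamming(a, b, w):
    return np.sum(w * (a != b)) / np.sum(w)

def verify(feature, templates, w, threshold=0.3):
    # Multi-template matching: accept if the closest enrolled template is near enough.
    dists = [weighted_hamming(feature, t, w) for t in templates]
    return min(dists) <= threshold, min(dists)

rng = np.random.RandomState(1)
w = rng.rand(256)                               # per-bit reliability weights (illustrative)
enrolled = [rng.randint(0, 2, 256) for _ in range(3)]
probe = enrolled[0].copy(); probe[:20] ^= 1     # a slightly corrupted genuine sample
print(verify(probe, enrolled, w))
```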
Adaptive Skin Detection in JPEG Compressed Images
Zheng Qingfang and Gao Wen
2006, 43(7):  1194-1200. 
Skin region detection plays an important role in a variety of applications such as face detection and adult image filtering. To improve the accuracy and speed of skin detection, an adaptive skin detection algorithm working in the compressed domain of JPEG images is proposed. The skin detector makes two main contributions: ① it adaptively controls the threshold according to image content; ② it does not require full decompression of JPEG images, but directly derives the color and texture features of each image block from DCT coefficients and uses both features jointly to classify each block as skin or non-skin. Comparisons with existing skin detection techniques demonstrate that the proposed algorithm is very fast and achieves good accuracy.
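In the spirit of the approach, a block can be judged from its DCT coefficients alone: the DC terms of the chroma channels give the block's mean color and the luma AC energy gives a texture cue. The fixed thresholds in this sketch are placeholders for the paper's adaptive, content-dependent thresholds.

```python
# Illustrative per-block skin test on JPEG DCT coefficients (8x8 blocks, signed Cb/Cr convention).
# Thresholds are placeholders; the paper adapts them to image content.
import numpy as np

def is_skin_block(dct_y, dct_cb, dct_cr, cb_range=(-30, 5), cr_range=(5, 40), ac_max=500.0):
    mean_cb = dct_cb[0, 0] / 8.0          # the DC coefficient is 8 times the block mean
    mean_cr = dct_cr[0, 0] / 8.0
    ac_energy = np.sum(dct_y ** 2) - dct_y[0, 0] ** 2   # texture measure from luma AC terms
    color_ok = cb_range[0] <= mean_cb <= cb_range[1] and cr_range[0] <= mean_cr <= cr_range[1]
    return color_ok and ac_energy <= ac_max

blk, cb, cr = np.zeros((8, 8)), np.zeros((8, 8)), np.zeros((8, 8))
cb[0, 0], cr[0, 0] = -80.0, 160.0         # DC terms of a smooth, skin-toned block
print(is_skin_block(blk, cb, cr))         # True
```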
A New Feature Extraction Method Based on Fisher Discriminant Minimal Criterion
Zheng Yujie, Yang Jingyu, Xu Yong, and Yu Dongjun
2006, 43(7):  1201-1206. 
Feature extraction is one of the hot topics in the field of pattern recognition. In this paper, a new feature extraction method called the Fisher discriminant minimal criterion is proposed to improve the performance of feature extraction. In this algorithm, the conventional Fisher discriminant criterion is inverted and the null space of the between-class scatter matrix is defined. Therefore, the limitation that the number of final eigenvectors is bounded by the number of classes is overcome, and more effective classification information can be obtained. Experimental results on face databases demonstrate the effectiveness of the proposed algorithm.
Playfield Detection Using Adaptive GMM and Its Application in Sports Video Analysis
Liu Yang, Huang Qingming, Gao Wen, and Ye Qixiang
2006, 43(7):  1207-1215. 
Playfield detection is a key step in sports video content analysis, but the playfield presents different colors in broadcast video as a result of illumination variation caused by different cameras, different shooting angles, etc. To address this problem, an adaptive GMM is exploited to detect the playfield. The proposed algorithm first selects sample frames from a video clip randomly and extracts the dominant color from the histogram automatically; the dominant color is then modeled approximately by a GMM. To adapt the model to playfield color variation, the model's parameters are updated by an incremental EM algorithm during playfield detection. Based on the playfield detection result, a simple and effective algorithm is proposed to classify the scenes in soccer video according to the playfield's distribution region in a frame. Experimental results show that the proposed algorithms are effective.
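A rough sketch of the dominant-color modeling step under assumed details (HSV pixels, a 36-bin hue histogram, scikit-learn's GaussianMixture); the incremental EM update used during detection is not shown.

```python
# Dominant-color GMM sketch: keep pixels near the hue-histogram peak and fit a small mixture.
import numpy as np
from sklearn.mixture import GaussianMixture

def dominant_color_model(pixels_hsv, n_components=3):
    hue = pixels_hsv[:, 0]
    hist, edges = np.histogram(hue, bins=36)
    peak = hist.argmax()
    near_peak = (hue >= edges[peak]) & (hue < edges[peak + 1])   # pixels of the dominant color
    gmm = GaussianMixture(n_components=n_components, random_state=0)
    return gmm.fit(pixels_hsv[near_peak])

def is_playfield(pixel_hsv, gmm, log_thresh=-10.0):
    # Classify a pixel by its log-likelihood under the dominant-color model.
    return gmm.score_samples(pixel_hsv.reshape(1, -1))[0] > log_thresh
```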
Constructing Convexity-Preserving Interpolation Curves of Hyperbolic Polynomial B-Splines Using a Shape Parameter
Chen Jun and Wang Guojin
2006, 43(7):  1216-1224. 
Using a blending factor, a parametrized singular polyline is blended with a hyperbolic polynomial B-spline curve to automatically generate a C^2 (or G^1) continuous hyperbolic polynomial B-spline curve with a shape parameter that interpolates the given planar data points. By converting the curvature sign function of the interpolating curve into a Bernstein polynomial, the nonnegativity conditions of Bernstein polynomials can be used to obtain values of the shape parameter that guarantee the convexity-preserving property of the constructed curve. The method is simple and convenient, needs neither solving a system of equations nor resorting to a complicated iterative process, and the resulting interpolating curves possess a smooth curvature distribution. Several numerical examples illustrate the correctness and validity of the algorithm.
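The sufficient nonnegativity test can be illustrated directly: convert a polynomial from the power basis to the Bernstein basis on [0,1]; if all Bernstein coefficients are nonnegative, the polynomial (here, a curvature sign function) is nonnegative on [0,1].

```python
# Power-basis to Bernstein-basis conversion and the resulting sufficient nonnegativity test.
from math import comb

def bernstein_coeffs(power_coeffs):
    # b_i = sum_{j<=i} C(i,j)/C(n,j) * a_j for p(t) = sum_j a_j t^j of degree n.
    n = len(power_coeffs) - 1
    return [sum(comb(i, j) / comb(n, j) * power_coeffs[j] for j in range(i + 1))
            for i in range(n + 1)]

def nonnegative_on_unit_interval(power_coeffs):
    # Sufficient (not necessary) condition: all Bernstein coefficients nonnegative.
    return all(b >= 0 for b in bernstein_coeffs(power_coeffs))

# Example: p(t) = 1 - t + t^2 has Bernstein coefficients [1, 0.5, 1], so it is >= 0 on [0, 1].
print(bernstein_coeffs([1.0, -1.0, 1.0]), nonnegative_on_unit_interval([1.0, -1.0, 1.0]))
```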
A Dual Round-Robin Algorithm for Combined Input-Crosspoint-Queued Switches
Zheng Yanfeng, Sun Shutao, He Simin, and Gao Wen
2006, 43(7):  1225-1232. 
The CICQ switch fabric is an ideal solution for multi-terabit switch implementation owing to its nice distributed scheduling property. Round-robin algorithms have been extensively studied because of their simplicity for hardware implementation. It is known that round-robin algorithms provide high throughput under uniform traffic; however, their performance degrades under nonuniform traffic. In this paper, the reason for the performance degradation of existing round-robin algorithms is identified, and a class of dual round-robin algorithms is proposed. In the proposed algorithms, each input arbiter is associated with dual round-robin pointers, named the primary pointer and the secondary pointer. The input queue corresponding to the primary pointer has the highest scheduling priority, and the decision to update the primary pointer can be made dynamically based on the input queue status. When the input queue corresponding to the primary pointer is blocked, the other input queues can be scheduled uniformly according to the secondary pointer position. Simulations show that the dual round-robin algorithms can significantly improve the performance of the CICQ switch under nonuniform traffic.
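A hypothetical sketch of one input arbiter with dual pointers, to convey the idea only; the actual pointer-update rules in the paper are more refined than this.

```python
# Dual round-robin input arbiter sketch: serve the primary queue when possible, otherwise
# scan the remaining queues round-robin from the secondary pointer.
class DualRRArbiter:
    def __init__(self, n_outputs):
        self.n = n_outputs
        self.primary = 0       # queue with the highest scheduling priority
        self.secondary = 0     # fallback pointer for fair service of the others

    def select(self, nonempty, unblocked):
        # Serve the primary queue if it has a cell and its crosspoint buffer is not full.
        if nonempty[self.primary] and unblocked[self.primary]:
            return self.primary
        # Otherwise scan the other queues round-robin starting after the secondary pointer.
        for k in range(1, self.n + 1):
            q = (self.secondary + k) % self.n
            if q != self.primary and nonempty[q] and unblocked[q]:
                self.secondary = q
                return q
        return None

arb = DualRRArbiter(4)
print(arb.select(nonempty=[0, 1, 1, 0], unblocked=[1, 1, 0, 1]))   # serves queue 1
```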
EDCP: A Duplication Checking Process Used in Duplication-Based Resource Allocation Policies
Yang Juan, Bai Yun, and Qiu Yuhui
2006, 43(7):  1233-1239. 
Optimizing the cost function of a resource allocation policy in heterogeneous systems is an important way to improve the system's concurrent processing capability. Duplication-based resource allocation policies have been proven effective in dealing with data-driven workloads. However, most of these policies overlook the trade-off between the cost of a duplication and the benefit it brings, and the lack of an effective mechanism to balance these two aspects eventually hampers performance improvement of the whole system. EDCP is an effective duplication checking process that can be applied to any duplication-based resource allocation policy in a heterogeneous environment. EDCP optimizes the cost function by preventing useless duplications without harming the system's overall benefit.
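The kind of check EDCP performs can be caricatured as a cost/benefit comparison before each duplication; the cost and benefit estimates below are placeholders, not the paper's cost function.

```python
# Hypothetical duplication check: replicate a data item only if the expected saving exceeds
# the cost of making and holding the copy. Estimates are placeholders for EDCP's cost function.
def should_duplicate(transfer_cost, storage_cost, expected_local_hits, remote_access_cost):
    benefit = expected_local_hits * remote_access_cost   # remote accesses saved by a local copy
    cost = transfer_cost + storage_cost
    return benefit > cost

print(should_duplicate(transfer_cost=5.0, storage_cost=1.0,
                       expected_local_hits=4, remote_access_cost=2.0))   # True
```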
A Joint-Entropy-Based Anonymity Metrics Model with Multiple Properties
Wu Zhenqiang and Ma Jianfeng
2006, 43(7):  1240-1245. 
An anonymity metrics model with three properties based on joint entropy is given in this paper; the three properties are identifiability, linkability, and traceability. According to the randomness and fuzziness of anonymity, a fuzzy pattern recognition model based on entropy and the least generalized weighted distance is presented, which makes the membership vector of anonymity grades suitably dispersive. A method is proposed for computing the balance parameter between the entropy and the generalized weighted distance; it is shown that the presented method outperforms Shannon-entropy-like models and determines the balance parameter uniquely. Joint entropy is thus applied to evaluate anonymity grades.
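For the entropy part of the model, an illustrative computation of an entropy-based anonymity degree (the attacker's distribution over possible subjects, normalized by the maximum entropy); the generalized-weighted-distance component and the balance parameter of the paper are omitted here.

```python
# Entropy-based anonymity degree: H(attacker's distribution) normalized by log2(N).
import math

def joint_entropy(probs):
    return -sum(p * math.log2(p) for p in probs if p > 0)

def anonymity_degree(probs):
    return joint_entropy(probs) / math.log2(len(probs)) if len(probs) > 1 else 0.0

# Four equally likely subjects give the maximal degree 1.0; a skewed distribution gives less.
print(anonymity_degree([0.25] * 4), anonymity_degree([0.7, 0.1, 0.1, 0.1]))
```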
New Braid Intractable Problems and Cryptographic Applications
Tang Xueming, Hong Fan, and Cui Guohua
2006, 43(7):  1246-1251. 
Using Shor's, Boneh's, and Lipton's quantum algorithms, quantum computers can solve large integer factorization problems, discrete logarithm problems, and discrete logarithm problems on elliptic curves, so public key cryptosystems based on these problems will become insecure in the age of quantum computers. The braid group appears to be a promising platform for public key cryptography in the future, but solutions to the underlying intractable problems make all current braid cryptosystems look vulnerable. Two kinds of new intractable problems, related to the p-th root finding problem and to linear representation attacks, are proposed and used to design a new key agreement protocol. With the proposed parameter choice, the new protocol can resist all currently known attacks.
An Efficient Approach to Intrusion Detection Based on Boosting Rule Learning
Yang Wu, Yun Xiaochun, and Li Jianhua
2006, 43(7):  1252-1259. 
Improving the detection rate and reducing the false positive rate of a detection model is an important research topic in the field of intrusion detection. Based on in-depth research on inductive learning theory, a rule learning algorithm is applied to building the intrusion detection model. To address the decline in detection precision when audit training data is scarce, an efficient approach to intrusion detection based on boosting rule learning (EAIDBRL) is proposed. In EAIDBRL, first, the weights of sample data in the traditional boosting algorithm are adjusted separately within each class, without changing the overall class weights, to avoid the deterioration of generalization performance on some intrusion detection data sets; second, the evaluation criteria for rule growing and rule pruning in the traditional rule learning algorithm are modified; and lastly, the improved boosting algorithm is adopted to enhance the generalization performance of the weak rule learner on the network audit data set. Experimental results on a standard intrusion detection data set indicate that EAIDBRL indeed improves the detection performance of the intrusion detection model built with the traditional rule learning algorithm.
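The class-wise reweighting idea can be sketched as a single boosting-round update in which sample weights change within each class but each class keeps its total weight; the AdaBoost-style exponential update below is a stand-in for the paper's exact rule.

```python
# One boosting-round weight update performed separately within each class, preserving class totals.
import numpy as np

def classwise_update(weights, y, errors, alpha):
    # errors[i] is 1 if sample i was misclassified in this round, else 0.
    new_w = weights * np.exp(alpha * errors)          # AdaBoost-style stand-in update
    for c in np.unique(y):
        mask = (y == c)
        new_w[mask] *= weights[mask].sum() / new_w[mask].sum()   # renormalize within the class
    return new_w

w = np.full(6, 1 / 6)
y = np.array([0, 0, 0, 1, 1, 1])
err = np.array([1, 0, 0, 0, 1, 0])
print(classwise_update(w, y, err, alpha=0.5), classwise_update(w, y, err, 0.5).sum())  # sums to 1
```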
An Approach to Security Protocol Analysis Reconciling Two Views
Zhao Huawei and Li Daxing
2006, 43(7):  1260-1266. 
The symbolic approach and the computational approach are two different approaches to the formal analysis of security protocols, but they are quite alien to each other. The former regards a protocol as a set of symbols and assumes that cryptography has an "all-or-nothing" security property; the latter regards a protocol as a set of strings and considers that cryptography is not strong enough to resist every adversary attack. Because each approach has both flaws and virtues, the idea pursued here is to draw on the virtues of both approaches and discard their flaws to build a mature approach for analyzing security protocols. A new analytic approach reconciling the two views is constructed following this idea. The new approach defines security in terms of completeness and correctness and analyzes both: it not only examines whether a protocol can reach its expected goals, but also computes the adversary's attacking ability against the protocol. This is the first approach that analyzes security protocols from the standpoints of both logical reliability and cryptographic reliability.
Study on Membership Problem with Respect to Temporal Functional Dependencies and Temporal Multivalued Dependencies
Hao Zhongxiao and Li Yanjuan
2006, 43(7):  1267-1272. 
For temporal schemes with temporal functional dependency and temporal multivalued dependency constraints, the use of multiple time granularities makes the membership problem more difficult to solve, yet a solution to the membership problem is essential for designing a practical scheme decomposition algorithm. Thus, in this paper, the strong closure set of a set of temporal types, the finite closure of attribute sets, the finite dependency base of attribute sets with respect to a certain temporal type, the finite dependency base of attribute sets, and the special finite dependency base of attribute sets are introduced; algorithms for the finite closure, the finite dependency base, the special finite dependency base, and the membership problem are given; and the termination and correctness of the algorithms are proved.
A Framework for Supporting Extended Transaction Models in J2EE Platform
Zhang Xin, Ding Xiaoning, Jin Beihong, and Li Jing
2006, 43(7):  1273-1279. 
With the development of Internet applications and distributed computing, the need to support distributed, collaborative, loosely coupled, and complex applications emerges. Although powerful, the flat transaction model is found lacking in functionality and efficiency when used for these new applications. To obtain more flexible models, various extensions to the traditional transaction model have been proposed. However, this research focuses on relaxing the constraints of the ACID properties and increasing concurrency and flexibility in specific environments, and the restriction of data source interfaces makes it difficult to implement the extended transaction primitives. A novel framework for supporting extended transaction models on the J2EE platform is presented in which the transaction manager is made to control concurrency. In this way, the transaction manager holds full control over applications' access to data sources and thus supports common extended transaction models.
A Serializable Concurrency Control Protocol in Wireless Broadcast Environments
Dang Depeng and Zhou Lizhu
2006, 43(7):  1280-1284. 
The issue of data consistency in mobile broadcast environments, which are characterized by asymmetric communication capacity, is studied. Under the protocol O-PreH, many necessary aborts must wait for validation at the server, so a lot of expensive upstream communication capacity is wasted. Under the protocol PVTO, the validation condition is too strict, so many unnecessary aborts are induced and the degree of concurrency is restricted. In this paper, a new serializable concurrency control protocol, BCC-SR, is proposed. Compared with PVTO and O-PreH, BCC-SR performs better pre-validation on the mobile hosts and the server, resulting in both fewer wasted resources and a smaller number of aborts. Simulation results confirm that the proposed protocol significantly improves the average response time of mobile transactions.
Interval+: An Index Structure on Compressed XML Data Based on Interval Tree
Bao Xiaoyuan, Tang Shiwei, and Yang Dongqing
2006, 43(7):  1285-1290. 
XML is a popular data exchange standard over the Internet and intranets, but because tags are added to every distinct semantic content unit, its space expansion makes transmitting and storing XML data very expensive in terms of resources. After compression the size of XML data is much smaller, but evaluating queries directly on the compressed data still requires further work. An XML index structure on compressed data, Interval+, is proposed, which is built from the result of reverse arithmetic compression. Queries of the form //element_1/element_2/…/element_m can be evaluated efficiently using Interval+.
A Partition Fuzzy Checkpointing Strategy for Real-Time Main Memory Databases
Liao Guoqiong, Liu Yunsheng, and Xiao Yingyuan
2006, 43(7):  1291-1296. 
Checkpointing is one of the important recovery techniques for real-time main memory database systems (RTMMDBS). By analyzing the data characteristics in RTMMDBS, a method to calculate data checkpointing priorities is presented, which takes into consideration the timing constraints of both data and transactions. A partition fuzzy checkpointing strategy based on data segment checkpointing priority, PFCS-SCP, is suggested, and its correctness is discussed. Performance testing shows that the PFCS-SCP strategy can decrease the miss ratio of transactions in RTMMDBS.
Problems in the Results of Policy Conflict Resolution and Their Detection and Resolution Methods in Network Management Systems
Li Xiangjun, Meng Luoming, and Jiao Li
2006, 43(7):  1297-1303. 
Nowadays, various methods have been proposed to resolve policy conflicts in network management systems. Each method works only in a limited or specific scope, so multiple policy conflict resolution methods are often combined to resolve policy conflicts. However, the processes of the different resolution methods are isolated from each other, and so are their results; existing resolution results are not taken into account by these processes, so conflicts may arise between the results of different resolution methods. The possible conflicts between the results of different policy conflict resolution methods are analyzed, and methods are proposed to detect and resolve these conflicts. Finally, experimental results and conclusions are presented.
Denotational Semantics Transformation from Continuation Style to Direct Style
Lü Jianghua, Ma Shilong, Pan Jing, and Jin Chengzhi
2006, 43(7):  1304-1308. 
The key difficulty in this semantic transformation is the mismatch between the signatures of the semantic functions. Based on an analysis of the two styles in earlier research, a technique for transforming denotational semantics from continuation style to direct style is given, in which continuations appearing in different contexts are handled differently. Finally, an implementation is described in Haskell, which tests the feasibility of the transformation method.
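A toy illustration of the continuation-to-direct correspondence (in Python rather than the authors' Haskell): for effect-free terms, a continuation-style denotation applied to the identity continuation coincides with the direct-style denotation, which is the essence of the transformation.

```python
# Direct-style and continuation-style denotations of a tiny arithmetic language; for pure terms,
# sem_cps(e, identity) == sem_direct(e). This is an illustration, not the authors' transform.
def sem_direct(expr):
    if isinstance(expr, int):
        return expr
    op, a, b = expr
    return sem_direct(a) + sem_direct(b) if op == "+" else sem_direct(a) * sem_direct(b)

def sem_cps(expr, k):
    if isinstance(expr, int):
        return k(expr)
    op, a, b = expr
    return sem_cps(a, lambda x: sem_cps(b, lambda y: k(x + y if op == "+" else x * y)))

e = ("+", 2, ("*", 3, 4))
print(sem_direct(e), sem_cps(e, lambda v: v))   # both 14
```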