Decision trees and decision rule systems are widely used in different applications: as algorithms for problem solving, as predictors, and as a way of representing knowledge. The aims of this book are (i) the consideration of the sets of decision trees, rules and reducts; (ii) the study of relationships among these objects; (iii) the design of algorithms for the construction of trees, rules and reducts; and (iv) the derivation of bounds on their complexity.
Applications to supervised machine learning, discrete optimization, analysis of acyclic programs, fault diagnosis, and pattern recognition are also considered. The book is a mixture of research monograph and lecture notes. The results it presents can be useful for researchers in machine learning, data mining and knowledge discovery, especially for those working in rough set theory, test theory and logical analysis of data.
The book can be used in the creation of courses for graduate students.
Hu and Cercone give a reduction algorithm using the positive region-based attribute significance as the guiding heuristic [3]. Wang et al.
Hu et al. Susmaga considers both indiscernibility and discernibility relations in attribute reduction [6]. These heuristic methods are fast but do not guarantee finding an optimal or minimal reduct. Some researchers therefore use stochastic methods for rough set attribute reduction. These are optimization methods, and they have a higher probability of finding a minimum reduct than the heuristic category. Wroblewski combines a genetic algorithm with a greedy algorithm to generate short reducts.
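To make the heuristic category concrete, a minimal sketch of a positive-region greedy reduction in the spirit of Hu and Cercone's approach [3] might look as follows; the table encoding and the names `dependency` and `greedy_reduct` are illustrative, not taken from the cited papers:

```python
from collections import defaultdict

def dependency(table, decisions, attrs):
    """Dependency degree gamma(attrs): the fraction of rows lying in the
    positive region, i.e. whose values on attrs determine the decision."""
    groups = defaultdict(list)
    for row, d in zip(table, decisions):
        groups[tuple(row[a] for a in attrs)].append(d)
    pos = sum(len(ds) for ds in groups.values() if len(set(ds)) == 1)
    return pos / len(table)

def greedy_reduct(table, decisions):
    """Greedily add the attribute that raises the dependency degree most,
    until the subset discerns decisions as well as the full attribute set."""
    all_attrs = range(len(table[0]))
    target = dependency(table, decisions, list(all_attrs))
    chosen = []
    while dependency(table, decisions, chosen) < target:
        best = max((a for a in all_attrs if a not in chosen),
                   key=lambda a: dependency(table, decisions, chosen + [a]))
        chosen.append(best)
    return chosen

# Four objects, three condition attributes; the decision copies attribute 1.
table = [[0, 0, 1], [0, 1, 1], [1, 0, 0], [1, 1, 0]]
decisions = [0, 1, 0, 1]
print(greedy_reduct(table, decisions))  # → [1]
```

As the surrounding text notes, such greedy significance-driven search is fast, but nothing forces the returned subset to have minimum cardinality.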
However, this method relies on highly time-consuming operations and cannot guarantee that the resulting subset is really a reduct [7]. Taking Wroblewski's work as a foundation, Bjorvand and Komorowski apply genetic algorithms to compute approximate reducts [8].
Their algorithm introduces several variations and practical improvements in both speed and approximation quality. Reduct generation algorithms based on genetic algorithms are thus quite efficient for rough set attribute reduction [9]. However, rough sets can only deal with discrete attributes.
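The cited genetic approaches are not reproduced here; purely as a hypothetical sketch of the underlying idea (attribute subsets as bit masks, consistency-first fitness, truncation selection, single-point crossover, bit-flip mutation), one could write:

```python
import random
from collections import defaultdict

def dependency(table, decisions, attrs):
    """Fraction of rows whose values on attrs determine the decision."""
    groups = defaultdict(list)
    for row, d in zip(table, decisions):
        groups[tuple(row[a] for a in attrs)].append(d)
    return sum(len(ds) for ds in groups.values() if len(set(ds)) == 1) / len(table)

def genetic_reducts(table, decisions, pop_size=30, gens=40, p_mut=0.1, seed=0):
    """Toy genetic search for short reducts: individuals are attribute masks;
    fitness prefers consistent subsets first, then shorter ones."""
    rng = random.Random(seed)
    n = len(table[0])

    def fit(mask):
        attrs = [i for i in range(n) if mask[i]]
        return (dependency(table, decisions, attrs), n - len(attrs))

    pop = [[rng.random() < 0.5 for _ in range(n)] for _ in range(pop_size)]
    for _ in range(gens):
        pop.sort(key=fit, reverse=True)
        parents = pop[: pop_size // 2]          # truncation selection
        children = []
        while len(parents) + len(children) < pop_size:
            a, b = rng.sample(parents, 2)
            cut = rng.randrange(1, n)           # single-point crossover
            child = a[:cut] + b[cut:]
            children.append([bit ^ (rng.random() < p_mut) for bit in child])
        pop = parents + children
    best = max(pop, key=fit)
    return [i for i in range(n) if best[i]]
```

Like the methods it caricatures, this search can return short reducts with high probability but offers no guarantee of minimality.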
A method for discretization based on particle swarm optimization (PSO) is presented in [10]. Building on this work, an algorithm for knowledge reduction in rough sets based on particle swarm optimization is proposed in [11]. This algorithm can solve some problems that the existing heuristic algorithms cannot.
To improve efficiency, many researchers continue to refine and update these algorithms. Santana-Quintero et al. combine the high convergence rate of the particle swarm optimization algorithm with a local search based on rough sets that is able to spread the nondominated solutions found.
Subsequently, a two-step particle swarm optimization for the feature selection problem was given by Bello et al.; the improved algorithm raises search efficiency. Chi et al. Hsieh and Horng presented a feature selection method based on an asynchronous discrete PSO search algorithm [15]. Other stochastic algorithms have also been applied to attribute reduction, for example, ant colony optimization (ACO) [16] and support vector regression (SVR) [17].
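None of the cited PSO variants is reproduced here. As a generic, hypothetical sketch of binary PSO for subset search (real-valued velocities, a sigmoid mapping velocity to the probability of a bit being 1, personal and global bests), the skeleton shared by such methods looks roughly like this:

```python
import math
import random

def binary_pso(fitness, n_bits, n_particles=20, iters=50,
               w=0.7, c1=1.5, c2=1.5, seed=0):
    """Generic binary PSO: positions are bit vectors, velocities are real,
    and a sigmoid maps each velocity to the probability of a 1-bit."""
    rng = random.Random(seed)
    pos = [[rng.random() < 0.5 for _ in range(n_bits)] for _ in range(n_particles)]
    vel = [[0.0] * n_bits for _ in range(n_particles)]
    pbest = [p[:] for p in pos]
    pbest_val = [fitness(p) for p in pos]
    g = max(range(n_particles), key=lambda i: pbest_val[i])
    gbest, gbest_val = pbest[g][:], pbest_val[g]
    for _ in range(iters):
        for i in range(n_particles):
            for j in range(n_bits):
                # Pull each bit toward the personal and global bests.
                vel[i][j] = (w * vel[i][j]
                             + c1 * rng.random() * (pbest[i][j] - pos[i][j])
                             + c2 * rng.random() * (gbest[j] - pos[i][j]))
                pos[i][j] = rng.random() < 1.0 / (1.0 + math.exp(-vel[i][j]))
            val = fitness(pos[i])
            if val > pbest_val[i]:
                pbest[i], pbest_val[i] = pos[i][:], val
                if val > gbest_val:
                    gbest, gbest_val = pos[i][:], val
    return gbest, gbest_val

# Hypothetical fitness: bit 1 is "useful", every selected bit has a cost.
mask, value = binary_pso(lambda m: (10 if m[1] else 0) - sum(m), n_bits=4)
```

The fitness function passed in is exactly where the design questions discussed below arise: the swarm only ever sees candidate subsets through this one number.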
For systems where an optimal or minimal subset is required, a stochastic method may be used. In this case, the problem is first transformed into one of finding the maximum or minimum of a fitness function, and then a stochastic optimization method is applied to solve the resulting maximization or minimization problem. A common way to transform a constrained optimization problem into an unconstrained fitness optimization problem is to use penalty methods [18]. For such methods, designing a good fitness function is the most important task. To obtain good performance, the fitness function should meet two requirements: the fitness evaluation of a candidate solution must be appropriate, and optimality equivalence must be guaranteed.
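For concreteness, one common weighted penalty-style form (a hypothetical example, not the function analysed in [19] nor the one proposed in this paper) trades classification quality against subset size:

```python
from collections import defaultdict

def dependency(table, decisions, attrs):
    """Fraction of rows whose values on attrs determine the decision."""
    groups = defaultdict(list)
    for row, d in zip(table, decisions):
        groups[tuple(row[a] for a in attrs)].append(d)
    return sum(len(ds) for ds in groups.values() if len(set(ds)) == 1) / len(table)

def weighted_fitness(mask, table, decisions, alpha=0.9):
    """Reward the dependency degree of the selected subset and, with weight
    (1 - alpha), its brevity; non-reducts are penalized only via gamma."""
    n = len(mask)
    attrs = [i for i, bit in enumerate(mask) if bit]
    gamma = dependency(table, decisions, attrs)
    return alpha * gamma + (1 - alpha) * (n - len(attrs)) / n

table = [[0, 0, 1], [0, 1, 1], [1, 0, 0], [1, 1, 0]]
decisions = [0, 1, 0, 1]
# The shorter reduct {1} scores higher than the full attribute set.
print(weighted_fitness([False, True, False], table, decisions))  # → 0.9666...
print(weighted_fitness([True, True, True], table, decisions))    # → 0.9
```

This illustrates the pitfall the paper targets: for an unfavourable choice of alpha, a short non-reduct can outscore every minimum reduct, so optimality equivalence is not guaranteed by the weighted form itself.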
Here, optimality equivalence means that an optimal solution of the fitness maximization problem corresponds to a minimum attribute reduction. Unfortunately, the existing fitness functions do not meet the above requirements well, which degrades the performance of the related algorithms [19]. In this paper, an applicable fitness function is proposed. Compared with the existing fitness functions mentioned earlier, it not only takes fewer factors into account but also overcomes this drawback.
The experimental results show that, for each of the two tested algorithms, using the proposed fitness function gives a higher probability of finding a minimum reduction than using the function proposed in [19]. The rest of the paper is organized as follows. Section 2 presents some concepts about minimum attribute reduction and reviews and analyses a fitness function proposed in [19].
In Section 3, a new fitness function and its properties are presented. In Section 4, the results of experiments and a comparative analysis are given. Finally, Section 5 concludes the paper. In this section, we review some basic notions from the theory of rough sets that are necessary for describing the minimum attribute reduction problem. For attribute reduction, the reduction with minimal cardinality is sought.
The minimal attribute reduction problem can be formulated as a nonlinear combinatorial optimization problem. By the definition of Red_m, the following proposition is apparent. According to Proposition 1, each element of Red_m corresponds to a minimal reduction. To solve problem (3), the most common approach is to transform it into an unconstrained maximization problem and solve that by heuristic algorithms. For this transformation, the equivalence of optimality between the minimum attribute reduction problem (3) and the maximization of the fitness function F(R) must be guaranteed.
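The displayed formulas for problem (3) and for the fitness maximization did not survive extraction. Under the assumption that the paper uses the standard dependency-degree formulation over the condition attribute set C and decision D (an assumption, since the originals are lost), a sketch of the two problems is:

```latex
% Minimum attribute reduction as a constrained combinatorial problem:
\min_{R \subseteq C} |R|
\quad \text{s.t.} \quad \gamma_R(D) = \gamma_C(D),
% and its unconstrained surrogate, solved by heuristic search:
\max_{R \subseteq C} F(R),
```

where F is a fitness function whose maximizers are intended to coincide with the minimum reducts.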
Unfortunately, most of the functions in the literature do not satisfy this requirement [19].