Many theoretical and experimental studies have shown that a multiple classifier system is an effective technique for reducing prediction errors [9,10,11,20,19]. These studies identify mainly three elements that characterize a set of classifiers:

- The representation of the input (what each individual classifier receives by way of input).
- The architecture of the individual classifiers (algorithms and parametrization).
- The way these classifiers are made to take a decision together.

It can be assumed that a combination method is efficient if each individual classifier makes errors 'in a different way', so that most of the classifiers can be expected to correct the mistakes that an individual one makes [1,19]. The term 'weak classifiers' refers to classifiers whose capacity has been reduced in some way so as to increase their prediction diversity: either their internal architecture is simple (e.g., they use mono-layer perceptrons instead of more sophisticated neural networks), or they are prevented from using all the information available. Since each classifier sees different sections of the learning set, the error correlation among them is reduced. It has been shown that the majority vote is the best strategy if the errors among the classifiers are not correlated. Moreover, in real applications, the majority vote also appears to be as efficient as more sophisticated decision rules [2,13]. One method of generating a diverse set of classifiers is to upset some aspect of the training input with respect to which the classifier is rather unstable. In the present paper, we study two distinct ways to create such weakened classifiers: learning set resampling (using the 'Bagging' approach [5]) and random feature subset selection (using 'MFS', a Multiple Feature Subsets approach [3]). Other recent and similar techniques, also based on modifications to the training set and/or the feature set, are not discussed here [7,8,12,21].
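As a minimal sketch of the two weakening strategies described above, the following illustrates Bagging-style resampling, MFS-style random feature subsets, and combination by majority vote. All names are illustrative, and the toy 1-nearest-neighbour base learner is an assumption of this sketch, not the base classifier used in the paper.

```python
import random
from collections import Counter

# Data points are ((feature_0, feature_1, ...), label) pairs.

def nn_predict(train, x, feats):
    # Toy base learner: 1-nearest-neighbour restricted to the feature subset.
    def dist(item):
        return sum((item[0][f] - x[f]) ** 2 for f in feats)
    return min(train, key=dist)[1]

def build_ensemble(data, n_classifiers, n_features, mode, rng):
    # Each ensemble member is a (training sample, feature subset) pair.
    ensemble = []
    for _ in range(n_classifiers):
        if mode == "bagging":
            # Resample the learning set with replacement; keep all features.
            sample = [rng.choice(data) for _ in data]
            feats = list(range(n_features))
        else:  # "mfs": keep the whole learning set; pick a random feature subset.
            sample = data
            k = max(1, n_features // 2)
            feats = rng.sample(range(n_features), k)
        ensemble.append((sample, feats))
    return ensemble

def majority_vote(ensemble, x):
    # Each weakened classifier votes; the most frequent label wins.
    votes = [nn_predict(sample, x, feats) for sample, feats in ensemble]
    return Counter(votes).most_common(1)[0][0]
```

Both strategies reduce error correlation the same way: each member sees a perturbed view of the training input (rows for Bagging, columns for MFS), so their individual mistakes tend to differ and the majority vote can correct them.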
"synopsis" may belong to another edition of this title.
This book constitutes the refereed proceedings of the First International Workshop on Multiple Classifier Systems, MCS 2000, held in Cagliari, Italy in June 2000. The 33 revised full papers presented together with five invited papers were carefully reviewed and selected for inclusion in the book. The papers are organized in topical sections on theoretical issues, multiple classifier fusion, bagging and boosting, design of multiple classifier systems, applications of multiple classifier systems, document analysis, and miscellaneous applications.
"About this title" may belong to another edition of this title.
Table of Contents:
- Ensemble Methods in Machine Learning
- Experiments with Classifier Combining Rules
- The 'Test and Select' Approach to Ensemble Combination
- A Survey of Sequential Combination of Word Recognizers in Handwritten Phrase Recognition at CEDAR
- Multiple Classifier Combination Methodologies for Different Output Levels
- A Mathematically Rigorous Foundation for Supervised Learning
- Classifier Combinations: Implementations and Theoretical Issues
- Some Results on Weakly Accurate Base Learners for Boosting Regression and Classification
- Complexity of Classification Problems and Comparative Advantages of Combined Classifiers
- Effectiveness of Error Correcting Output Codes in Multiclass Learning Problems
- Combining Fisher Linear Discriminants for Dissimilarity Representations
- A Learning Method of Feature Selection for Rough Classification
- Analysis of a Fusion Method for Combining Marginal Classifiers
- A Hybrid Projection-Based and Radial Basis Function Architecture
- Combining Multiple Classifiers in Probabilistic Neural Networks
- Supervised Classifier Combination through Generalized Additive Multi-model
- Dynamic Classifier Selection
- Boosting in Linear Discriminant Analysis
- Different Ways of Weakening Decision Trees and Their Impact on Classification Accuracy of DT Combination
- Applying Boosting to Similarity Literals for Time Series Classification
- Boosting of Tree-Based Classifiers for Predictive Risk Modeling in GIS
- A New Evaluation Method for Expert Combination in Multi-expert System Designing
- Diversity between Neural Networks and Decision Trees for Building Multiple Classifier Systems
- Self-Organizing Decomposition of Functions
- Classifier Instability and Partitioning
- A Hierarchical Multiclassifier System for Hyperspectral Data Analysis
- Consensus Based Classification of Multisource Remote Sensing Data
- Combining Parametric and Nonparametric Classifiers for an Unsupervised Updating of Land-Cover Maps
- A Multiple Self-Organizing Map Scheme for Remote Sensing Classification
- Use of Lexicon Density in Evaluating Word Recognizers
- A Multi-expert System for Dynamic Signature Verification
- A Cascaded Multiple Expert System for Verification
- Architecture for Classifier Combination Using Entropy Measures
- Combining Fingerprint Classifiers
- Statistical Sensor Calibration for Fusion of Different Classifiers in a Biometric Person Recognition Framework
- A Modular Neuro-Fuzzy Network for Musical Instruments Classification
- Classifier Combination for Grammar-Guided Sentence Recognition
- Shape Matching and Extraction by an Array of Figure-and-Ground Classifiers