Many theoretical and experimental studies have shown that a multiple classifier system is an effective technique for reducing prediction errors [9,10,11,20,19]. These studies identify mainly three elements that characterize a set of classifiers:
- The representation of the input (what each individual classifier receives by way of input).
- The architecture of the individual classifiers (algorithms and parametrization).
- The way these classifiers are made to take a decision together.

It can be assumed that a combination method is efficient if each individual classifier makes errors 'in a different way', so that most of the classifiers can be expected to correct the mistakes that any individual one makes [1,19]. The term 'weak classifiers' refers to classifiers whose capacity has been reduced in some way so as to increase their prediction diversity. Either their internal architecture is simple (e.g., they use mono-layer perceptrons instead of more sophisticated neural networks), or they are prevented from using all the information available. Since each classifier sees a different section of the learning set, the error correlation among them is reduced. It has been shown that the majority vote is the best strategy if the errors among the classifiers are not correlated. Moreover, in real applications, the majority vote also appears to be as efficient as more sophisticated decision rules [2,13].

One method of generating a diverse set of classifiers is to perturb some aspect of the training input to which the classifier is unstable. In the present paper, we study two distinct ways to create such weakened classifiers: learning set resampling (using the 'Bagging' approach [5]) and random feature subset selection (using 'MFS', a Multiple Feature Subsets approach [3]). Other recent and similar techniques are not discussed here, but they are also based on modifications to the training set and/or the feature set [7,8,12,21].
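As a rough illustration (not the authors' implementation), both weakening strategies can be sketched with scikit-learn's BaggingClassifier: bootstrap resampling gives Bagging, and its max_features option gives an MFS-style random-feature-subset ensemble. The dataset, base learner, ensemble size, and subset fraction below are arbitrary assumptions, and scikit-learn combines tree predictions by averaging class probabilities rather than by the plain majority vote discussed above.

```python
# Minimal sketch of the two classifier-weakening strategies, assuming
# scikit-learn is available; all concrete parameter values are illustrative.
from sklearn.datasets import make_classification
from sklearn.ensemble import BaggingClassifier
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier

# Synthetic stand-in for a learning set (assumption, not the paper's data).
X, y = make_classification(n_samples=500, n_features=20, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

# Bagging [5]: each classifier is trained on a bootstrap resample of the
# learning set and sees all features.
bagging = BaggingClassifier(DecisionTreeClassifier(), n_estimators=25,
                            bootstrap=True, max_features=1.0, random_state=0)

# MFS-style [3]: each classifier is trained on all examples but only on a
# random subset (here half) of the features, reducing error correlation.
mfs = BaggingClassifier(DecisionTreeClassifier(), n_estimators=25,
                        bootstrap=False, max_features=0.5, random_state=0)

for name, clf in [("Bagging", bagging), ("MFS-style", mfs)]:
    clf.fit(X_tr, y_tr)  # predictions are aggregated across the ensemble
    print(name, "test accuracy:", clf.score(X_te, y_te))
```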
"synopsis" may belong to another edition of this title.
This book constitutes the refereed proceedings of the First International Workshop on Multiple Classifier Systems, MCS 2000, held in Cagliari, Italy in June 2000. The 33 revised full papers presented together with five invited papers were carefully reviewed and selected for inclusion in the book. The papers are organized in topical sections on theoretical issues, multiple classifier fusion, bagging and boosting, design of multiple classifier systems, applications of multiple classifier systems, document analysis, and miscellaneous applications.
"About this title" may belong to another edition of this title.
Seller: Majestic Books, Hounslow, United Kingdom
Condition: New. pp. 424 Illus. Seller Inventory # 5851051
Quantity: 1 available
Seller: Books Puddle, New York, NY, U.S.A.
Condition: New. pp. 424. Seller Inventory # 263078260
Seller: Biblios, Frankfurt am Main, HESSE, Germany
Condition: New. pp. 424. Seller Inventory # 183078270
Seller: GuthrieBooks, Spring Branch, TX, U.S.A.
Paperback. Condition: Very Good. Ex-library paperback in very nice condition with the usual markings and attachments. Seller Inventory # DA1411791
Seller: Lucky's Textbooks, Dallas, TX, U.S.A.
Condition: New. Seller Inventory # ABLIING23Mar3113020174725
Seller: GreatBookPrices, Columbia, MD, U.S.A.
Condition: New. Seller Inventory # 919229-n
Seller: Ria Christie Collections, Uxbridge, United Kingdom
Condition: New. Seller Inventory # ria9783540677048_new
Quantity: Over 20 available
Seller: Chiron Media, Wallingford, United Kingdom
Paperback. Condition: New. Seller Inventory # 6666-IUK-9783540677048
Quantity: Over 20 available
Seller: GreatBookPricesUK, Woodford Green, United Kingdom
Condition: New. Seller Inventory # 919229-n
Quantity: Over 20 available
Seller: BuchWeltWeit Ludwig Meier e.K., Bergisch Gladbach, Germany
Paperback. Condition: New. This item is printed on demand, so it takes 3-4 days longer. New stock. 424 pp. English. Seller Inventory # 9783540677048