Theoretical Foundations Of Linear And Order Statistics Combiners For Neural Pattern Classifiers
Abstract: Several researchers have experimentally shown that substantial improvements can be obtained in difficult pattern recognition problems by combining or integrating the outputs of multiple classifiers. This paper provides an analytical framework to quantify the improvements in classification results due to combining. The results apply to both linear combiners and the order statistics combiners introduced in this paper. We show that combining networks in output space reduces the variance of the actual decision region boundaries around the optimum boundary. For linear combiners, we show that in the absence of classifier bias, the added classification error is proportional to the boundary variance. For non-linear combiners, we show analytically that the selection of the median, the maximum and in general the ith order statistic improves classifier performance. The analysis presented here facilitates the understanding of the relationships among error rates, classifier boundary distributions...
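The two families of combiners described in the abstract can be sketched in a few lines: a linear combiner averages the per-class posteriors of the ensemble, while an order statistics combiner sorts each class's posteriors across classifiers and keeps the ith value (the median for i = N/2, the maximum for i = N-1). The snippet below is a minimal illustration, not the paper's method; the noise model (independent zero-mean Gaussian perturbations of a fixed true posterior) and all names are assumptions for demonstration.

```python
import random
import statistics

def linear_combiner(outputs):
    """Average the per-class posteriors across all classifiers."""
    return [statistics.fmean(col) for col in zip(*outputs)]

def order_statistic_combiner(outputs, i):
    """Keep the i-th order statistic (0-indexed) per class across classifiers."""
    return [sorted(col)[i] for col in zip(*outputs)]

random.seed(0)
true_posterior = [0.7, 0.3]          # hypothetical 2-class posterior
n_classifiers = 25

# Simulate each classifier's output as the true posterior plus
# independent zero-mean noise, clipped to [0, 1] (assumed noise model).
outputs = [
    [min(max(p + random.gauss(0, 0.15), 0.0), 1.0) for p in true_posterior]
    for _ in range(n_classifiers)
]

avg = linear_combiner(outputs)                              # linear combiner
med = order_statistic_combiner(outputs, n_classifiers // 2) # median combiner
mx  = order_statistic_combiner(outputs, n_classifiers - 1)  # max combiner

print("linear:", avg, "median:", med, "max:", mx)
```

Averaging N independent estimates shrinks the variance of the combined posterior (and hence of the induced decision boundary) by roughly a factor of N, which is the intuition behind the paper's variance-reduction result; the median combiner achieves a similar effect while being more robust to outlying classifier outputs.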
Subject: Computer Science