Publications
List of publications on Boosting
- Z.-H. Zhou, Y. Jiang, and S.-F. Chen. Extracting symbolic rules from trained neural network ensembles. AI Communications, in press. (PDF)
- R. Meir and G. Rätsch. An introduction to boosting and leveraging. In S. Mendelson and A. Smola, editors, Advanced Lectures on Machine Learning, LNCS, pages 119-184. Springer, 2003. In press. Copyright by Springer Verlag. (PDF)
- F. Audrino and P. Bühlmann. Volatility estimation with functional gradient descent for very high-dimensional financial time series. Journal of Computational Finance, 2002. To appear. See http://stat.ethz.ch/~buhlmann/bibliog.html.
- N. Cesa-Bianchi, A. Conconi, and C. Gentile. A second-order perceptron algorithm. In Proceedings of the Annual Conference on Computational Learning Theory, volume 2375 of LNAI, pages 121-137, Sydney, February 2002. Springer. Copyright by Springer.
- M. Dettling and P. Bühlmann. How to use boosting for tumor classification with gene expression data. Preprint. See http://stat.ethz.ch/~dettling/boosting, 2002.
- Y. Freund and M. Opper. Drifting games and Brownian motion. Journal of Computer and System Sciences, 64:113-132, 2002.
- T. Kanamori. A new sequential algorithm for regression problems by using mixture distribution. 2002.
- S. Kutin and P. Niyogi. Almost-everywhere algorithmic stability and generalization error. Technical Report TR-2002-03, Department of Computer Science, The University of Chicago, 2002.
- P.M. Long. Minimum majority classification and boosting. In AAAI, 2002.
- G. Lugosi and N. Vayatis. A consistent strategy for boosting algorithms. In Proceedings of the Annual Conference on Computational Learning Theory, volume 2375 of LNAI, pages 303-318, Sydney, February 2002. Springer.
- C. Mesterharm. Tracking linear-threshold concepts with Winnow. In Proceedings of the Annual Conference on Computational Learning Theory, volume 2375 of LNAI, pages 138-152, Sydney, February 2002. Springer. Copyright by Springer.
- G. Rätsch and M.K. Warmuth. Maximizing the margin with boosting. In Proceedings of the Annual Conference on Computational Learning Theory, volume 2375 of LNAI, pages 334-350, Sydney, February 2002. Springer. Copyright by Springer. (PDF)
- G. Rätsch and M.K. Warmuth. Efficient margin maximization with boosting. Submitted to JMLR, December 2002. (PDF)
- G. Rätsch, A. Demiriz, and K. Bennett. Sparse regression ensembles in infinite and finite hypothesis spaces. Machine Learning, 48(1-3):193-221, 2002. Special Issue on New Methods for Model Selection and Model Combination. Also NeuroCOLT2 Technical Report NC-TR-2000-085. (PDF)
- G. Rätsch, S. Mika, B. Schölkopf, and K.-R. Müller. Constructing boosting algorithms from SVMs: an application to one-class classification. IEEE PAMI, 24(9), September 2002. Earlier version is GMD Technical Report No. 119, 2000. (PDF)
- G. Rätsch, S. Mika, and M.K. Warmuth. On the convergence of leveraging. In T.G. Dietterich, S. Becker, and Z. Ghahramani, editors, Advances in Neural Information Processing Systems, volume 14, 2002. In press. Longer version also NeuroCOLT Technical Report NC-TR-2001-098. (PDF)
- M. Rochery, R. Schapire, M. Rahim, N. Gupta, G. Riccardi, S. Bangalore, H. Alshawi, and S. Douglas. Combining prior knowledge and boosting for call classification in spoken language dialogue. In International Conference on Acoustics, Speech and Signal Processing, 2002.
- S. Mannor, R. Meir, and T. Zhang. The consistency of greedy algorithms for classification. In Proceedings of the Annual Conference on Computational Learning Theory, volume 2375 of LNAI, pages 319-333, Sydney, February 2002. Springer. Copyright by Springer, Berlin.
- R.E. Schapire. The boosting approach to machine learning: An overview. In Workshop on Nonlinear Estimation and Classification. MSRI, 2002.
- X. Sun. Pitch accent prediction using ensemble machine learning, 2002.
- Z.-H. Zhou, Y. Jiang, Y.-B. Yang, and S.-F. Chen. Lung cancer cell identification based on artificial neural network ensembles. Artificial Intelligence in Medicine, 24(1):25-36, 2002. (PDF)
- Z.-H. Zhou, J. Wu, and W. Tang. Ensembling neural networks: many could be better than all. Artificial Intelligence, 137(1-2):239-263, 2002. (PDF)
- O. Bousquet and A. Elisseeff. Algorithmic stability and generalization performance. In Advances in Neural Information Processing Systems 13: Proc. NIPS'2000, 2001. (PDF)
- N. Bshouty and D. Gavinsky. On boosting with optimal poly-bounded distributions. In Proc. COLT, 2001.
- N. Cesa-Bianchi and G. Lugosi. Potential-based algorithms in on-line prediction and game theory. In Proc. COLT, 2001.
- A. Demiriz, K.P. Bennett, and J. Shawe-Taylor. Linear programming boosting via column generation. Machine Learning, 2001. To appear in the special issue on Support Vector Machines and Kernel Methods (N. Cristianini, C. Campbell, and C. Burges, editors). (PDF)
- J.J. Rodriguez Diez and C.J. Alonso Gonzalez. Learning classification RBF networks by boosting. In J. Kittler and F. Roli, editors, Proceedings of the 2nd International Workshop on Multiple Classifier Systems MCS2001, volume 2096 of LNCS, pages 32-42. Springer, 2001.
- Y. Freund. An adaptive version of the boost by majority algorithm. Machine Learning, 43(3):293-318, June 2001.
- Y. Freund, Y. Mansour, and R.E. Schapire. Why averaging classifiers can protect against overfitting. In Proceedings of the Eighth International Workshop on Artificial Intelligence and Statistics, 2001. (PDF)
- Y. Grandvalet. Bagging can stabilize without reducing variance. In ICANN'01, Lecture Notes in Computer Science. Springer, 2001.
- P. Grünwald. Strong entropy concentration, game theory and algorithmic randomness. In Proc. COLT, 2001.
- T. Hastie, R. Tibshirani, and J. Friedman. The Elements of Statistical Learning: Data Mining, Inference, and Prediction. Springer, 2001.
- P.A.d.F.R. Højen-Sørensen, O. Winther, and L.K. Hansen. Ensemble learning and linear response theory for ICA. In Advances in Neural Information Processing Systems 13: Proc. NIPS'2000, 2001. (PDF)
- W. Jiang. Some theoretical aspects of boosting in the presence of noisy data. Technical Report 01-01, Department of Statistics, Northwestern University, 2001. To appear in Proceedings: The Eighteenth International Conference on Machine Learning (ICML-2001), June 2001, Morgan Kaufmann. (PDF)
- B. Kégl, T. Linder, and G. Lugosi. Data-dependent margin-based generalization bounds for classification. In Proc. COLT, 2001.
- V. Koltchinskii, D. Panchenko, and F. Lozano. Further explanation of the effectiveness of voting methods: The game between margins and weights. In Proc. COLT, 2001.
- V. Koltchinskii, D. Panchenko, and F. Lozano. Some new bounds on the generalization error of combined classifiers. Advances in Neural Information Processing Systems 13: Proc. of NIPS'2000, 2001.
- S. Kutin and P. Niyogi. The interaction of stability and weakness in AdaBoost. Technical Report TR-2001-30, Department of Computer Science, The University of Chicago, 2001. (PDF)
- G. Lebanon and J. Lafferty. Boosting and maximum likelihood for exponential models. In Neural Information Processing Systems (NIPS), volume 14, 2001.
- G. Lugosi and N. Vayatis. On the Bayes-risk consistency of boosting methods. Technical report, Department of Economics, Pompeu Fabra University, Spain, 2001.
- S. Mannor and R. Meir. Geometric bounds for generalization in boosting. In Proceedings of the Fourteenth Annual Conference on Computational Learning Theory, 2001. (PDF)
- S. Mannor and R. Meir. Weak learners and improved convergence rate in boosting. In Advances in Neural Information Processing Systems 13: Proc. NIPS'2000, 2001. (PDF)
- S. Mannor, R. Meir, and S. Mendelson. On the consistency of boosting algorithms. Submitted to Advances in Neural Information Processing Systems 14, June 2001. (PDF)
- S. Merler, C. Furlanello, B. Larcher, and A. Sboner. Tuning cost-sensitive boosting and its application to melanoma diagnosis. In J. Kittler and F. Roli, editors, Proceedings of the 2nd International Workshop on Multiple Classifier Systems MCS2001, volume 2096 of LNCS, pages 32-42. Springer, 2001.
- P.J. Moreno, B. Logan, and B. Raj. A boosting approach for confidence scoring. In Eurospeech 2001, September 2001.
- V. Pavlovic and A. Garg. Efficient detection of objects and attributes using boosting. In IEEE Conf. Computer Vision and Pattern Recognition - Technical Sketches, Kauai, HI, December 2001.
- A. Piccolboni and C. Schindelhauer. Discrete prediction games with arbitrary feedback and loss. In Proc. COLT, 2001.
- G. Rätsch. Robust Boosting via Convex Optimization. PhD thesis, University of Potsdam, October 2001. (PDF)
- G. Rätsch and M.K. Warmuth. Marginal boosting. NeuroCOLT2 Technical Report 97, Royal Holloway College, London, July 2001. (PDF)
- G. Rätsch, S. Mika, and M.K. Warmuth. On the convergence of leveraging. NeuroCOLT2 Technical Report 98, Royal Holloway College, London, August 2001. (PDF)
- G. Rätsch, T. Onoda, and K.-R. Müller. Soft margins for AdaBoost. Machine Learning, 42(3):287-320, March 2001. Also NeuroCOLT Technical Report NC-TR-1998-021. (PDF)
- S. Ben-David, P.M. Long, and Y. Mansour. Agnostic boosting. In Proc. COLT, 2001.
- R.E. Schapire. Drifting games. Machine Learning, 2001. To appear. (PDF)
- R.A. Servedio. Smooth boosting and learning with malicious noise. In Proc. COLT, 2001.
- M. Skurichina and R.P.W. Duin. Bagging and the random subspace method for redundant feature spaces. In J. Kittler and F. Roli, editors, Proceedings of the 2nd International Workshop on Multiple Classifier Systems MCS2001, volume 2096 of LNCS, pages 1-10. Springer, 2001.
- J. Suykens, J. Vandewalle, and B. De Moor. Intelligence and cooperative search by coupled local minimizers. International Journal of Bifurcation and Chaos, 11(8):2133-2144, 2001.
- E. Tapia, J.C. Gonzalez, and J. Villena. A generalized class of boosting algorithms based on recursive decoding models. In J. Kittler and F. Roli, editors, Proceedings of the 2nd International Workshop on Multiple Classifier Systems MCS2001, volume 2096 of LNCS, pages 22-31. Springer, 2001.
- V. Tresp. Committee machines. In Handbook of Neural Network Signal Processing. CRC Press, 2001. (PDF)
- V. Tresp. Mixtures of Gaussian processes. In Advances in Neural Information Processing Systems 13: Proc. of NIPS'00, volume 13, 2001.
- P. Viola and M. Jones. Robust real-time object detection. In Proc. ICCV, 2001.
- M.A. Walker, O. Rambow, and M. Rogati. SPoT: A trainable sentence planner. In Proc. 2nd Annual Meeting of the North American Chapter of the Association for Computational Linguistics, 2001.
- J. Wickramaratna, S. Holden, and B. Buxton. Performance degradation in boosting. In J. Kittler and F. Roli, editors, Proceedings of the 2nd International Workshop on Multiple Classifier Systems MCS2001, volume 2096 of LNCS, pages 11-21. Springer, 2001.
- L. Yan and D. J. Miller. Critic-driven ensemble classification via a learning method akin to boosting. In Intelligent Engineering Systems Through Artificial Neural Networks 11, pages 27-32, 2001.
- H. Yoshii. A big mistake concerning boosting. In Information-Based Induction Sciences IBIS2001, pages 285-290, 2001.
- R.S. Zemel and T. Pitassi. A gradient-based boosting algorithm for regression problems. In NIPS-13: Advances in Neural Information Processing Systems, 13, Cambridge, MA, 2001. MIT Press. (PDF)
- T. Zhang. Statistical behavior and consistency of classification methods based on convex risk minimization. Technical Report RC22155, IBM Research, Yorktown Heights, NY, 2001. (PDF)
- E.L. Allwein, R.E. Schapire, and Y. Singer. Reducing multiclass to binary: A unifying approach for margin classifiers. Journal of Machine Learning Research, 1:113-141, 2000. (PDF)
- K.P. Bennett, A. Demiriz, and J. Shawe-Taylor. A column generation algorithm for boosting. In Pat Langley, editor, Proceedings of Seventeenth International Conference on Machine Learning (ICML'2000), pages 65-72. Morgan Kaufmann, 2000. (PDF)
- L. Breiman. Some infinite theory for predictor ensembles. Technical Report 577, Statistics Department, UC Berkeley, August 2000. (PDF)
- M. Collins, R.E. Schapire, and Y. Singer. Logistic regression, AdaBoost and Bregman distances. In Proceedings of the Thirteenth Annual Conference on Computational Learning Theory, 2000. (PDF)
- H. Drucker. Effect of pruning and early stopping on performance of a boosted ensemble. In Proceedings of the International Meeting on Nonlinear Methods and Data Mining, pages 26-40, Rome, Italy, 2000. (PDF)
- N. Duffy and D. Helmbold. Leveraging for regression. In Proceedings of the Thirteenth Annual Conference on Computational Learning Theory, 2000. (PDF)
- N. Duffy and D. Helmbold. Potential boosters? In S.A. Solla, T.K. Leen, and K.-R. Müller, editors, Advances in Neural Information Processing Systems 12, pages 258-264. MIT Press, 2000. (PDF)
- G. Escudero, L. Màrquez, and G. Rigau. Boosting applied to word sense disambiguation. In LNAI 1810: Proceedings of the 12th European Conference on Machine Learning, ECML, pages 129-141, Barcelona, Spain, 2000. (PDF)
- D. Freitag and N. Kushmerick. Boosted wrapper induction. In Proc. 17th Nat. Conf. Artificial Intelligence (AAAI), pages 577-583, 2000. (PDF)
- Y. Freund and R.E. Schapire. Discussion of the paper ``additive logistic regression: a statistical view of boosting'' by J. Friedman, T. Hastie and R. Tibshirani. The Annals of Statistics, 28(2):391-393, 2000. (PDF)
- F.J. Huang, Z.-H. Zhou, H.-J. Zhang, and T. Chen. Pose invariant face recognition. In Proceedings of the 4th IEEE International Conference on Automatic Face and Gesture Recognition, pages 245-250, Grenoble, France, 2000.
- R.D. Iyer, D.D. Lewis, R.E. Schapire, Y. Singer, and A. Singhal. Boosting for document routing. In Proceedings of the Ninth International Conference on Information and Knowledge Management, 2000. (PDF)
- W. Jiang. Does boosting overfit: Views from an exact solution. Technical Report 00-04, Department of Statistics, Northwestern University, September 2000.
- W. Jiang. Is regularization unnecessary for boosting? Technical Report 00-04, Department of Statistics, Northwestern University, November 2000. To appear in Proceedings: The Eighth International Workshop on Artificial Intelligence and Statistics, January 2001, Morgan Kaufmann.
- W. Jiang. On weak base hypotheses and their implications for boosting regression and classification. Technical Report 00-01, Department of Statistics, Northwestern University, October 2000. Former title: ``Large Time Behavior of Boosting Algorithms for Regression and Classification''.
- W. Jiang. Process consistency for AdaBoost. Technical Report 00-05, Department of Statistics, Northwestern University, November 2000.
- W. Jiang. Some results on weakly accurate base learners for boosting regression and classification. In Proceedings of the First International Workshop on Multiple Classifier Systems, Cagliari, Italy, June 2000, volume 1857 of Lecture Notes in Computer Science, pages 87-96. Springer, 2000.
- A. Kolcz. N-tuple network, CART and bagging. Neural Computation, 12(2):293-304, 2000.
- V. Koltchinskii and D. Panchenko. Bounding the generalization error of neural networks and combined classifiers. In Proc. of the Second ICSC Symposium on Neural Computation, Berlin, 2000.
- V. Koltchinskii and D. Panchenko. Empirical margin distributions and bounding the generalization error of combined classifiers. Submitted, 2000.
- V. Koltchinskii, D. Panchenko, and F. Lozano. Bounding the generalization error of convex combinations of classifiers: Balancing the dimensionality and the margins. Submitted, 2000.
- Y. Liu, X. Yao, and T. Higuchi. Evolutionary ensembles with negative correlation learning. IEEE-EC, 4(4):380, November 2000.
- S. Mannor and R. Meir. On the existence of weak learners and applications to boosting. Submitted to Machine Learning, August 2000. (PDF)
- M.C. Mozer, R. Wolniewicz, D. Grimes, E. Johnson, and H. Kaushansky. Predicting subscriber dissatisfaction and improving retention in the wireless telecommunications industry. IEEE Transactions on Neural Networks, 11:690-696, 2000.
- T. Onoda, G. Rätsch, and K.-R. Müller. Applying support vector machines and boosting to a non-intrusive monitoring system for household electric appliances with inverters. In Proceedings of NC'2000, 2000.
- V. Pavlovic, A. Garg, and J.M. Rehg. Multimodal speaker detection using error feedback dynamic Bayesian networks. In IEEE Conf. Computer Vision and Pattern Recognition, Hilton Head Island, SC, June 2000.
- G. Rätsch, A. Demiriz, and K. Bennett. Sparse regression ensembles in infinite and finite hypothesis spaces. NeuroCOLT2 Technical Report 2000-085, Royal Holloway College, London, September 2000. Accepted for publication in the Machine Learning journal special issue on ``New Methods for Model Selection and Model Combination''. (PDF)
- G. Rätsch, B. Schölkopf, A. Smola, S. Mika, T. Onoda, and K.-R. Müller. Robust ensemble learning. In A.J. Smola, P.L. Bartlett, B. Schölkopf, and D. Schuurmans, editors, Advances in Large Margin Classifiers, pages 207-219. MIT Press, Cambridge, MA, 2000. (PDF)
- G. Rätsch, B. Schölkopf, A. Smola, K.-R. Müller, T. Onoda, and S. Mika. ν-Arc: Ensemble learning in the presence of outliers. In M.S. Kearns, S.A. Solla, and D.A. Cohn, editors, Advances in Neural Information Processing Systems 12: Proc. of NIPS'99. MIT Press, 2000. (PDF)
- G. Rätsch, B. Schölkopf, S. Mika, and K.-R. Müller. SVM and Boosting: One class. Technical Report 119, GMD FIRST, Berlin, November 2000.
- G. Rätsch, M.K. Warmuth, S. Mika, T. Onoda, S. Lemm, and K.-R. Müller. Barrier boosting. In Proceedings of the Thirteenth Annual Conference on Computational Learning Theory, 2000. (PDF)
- R. Meir, R. El-Yaniv, and S. Ben-David. Localized boosting. In Proceedings of the 13th Annual Conference on Computational Learning Theory, pages 190-199, 2000. (PDF)
- R.E. Schapire and Y. Singer. BoosTexter: A boosting-based system for text categorization. Machine Learning, 39(2/3):135-168, May/June 2000. (PDF)
- H. Schwenk and Y. Bengio. Boosting neural networks. Neural Computation, 12(8):1869-1887, 2000.
- F. Sebastiani, A. Sperduti, and N. Valdambrini. An improved boosting algorithm and its application to automated text categorization. In Arvin Agah, Jamie Callan, and Elke Rundensteiner, editors, Proceedings of CIKM-00, 9th ACM International Conference on Information and Knowledge Management, pages 78-85, McLean, US, 2000. ACM Press, New York, US. (PDF)
- J. Shawe-Taylor and G. Karakoulas. Towards a strategy for boosting regressors. In A.J. Smola, P.L. Bartlett, B. Schölkopf, and D. Schuurmans, editors, Advances in Large Margin Classifiers, pages 247-258, Cambridge, MA, 2000. MIT Press.
- K. Tieu and P. Viola. Boosting image retrieval. In Proceedings IEEE Conf. on Computer Vision and Pattern Recognition, 2000.
- V. Tresp. A Bayesian committee machine. Neural Computation, 12(11):2719-2741, 2000.
- V. Tresp. The generalized Bayesian committee machine. In Proceedings of the Sixth ACM SIGKDD International Conference on Knowledge Discovery and Data Mining, KDD-2000, pages 130-139, 2000.
- A. Tsymbal and S. Puuronen. Bagging and boosting with dynamic integration of classifiers. In Proceedings of PKDD 2000, Lyon, France, Lecture Notes in Artificial Intelligence, Springer Verlag, volume 1910, pages 116-125, 2000. (PDF)
- A. Utsugi. Bayesian sampling and ensemble learning in generative topographic mapping. Neural Processing Letters, 12(3):277-290, 2000.
- G.I. Webb. MultiBoosting: A technique for combining boosting and wagging. Machine Learning, 40(2):159-196, 2000.
- X. Zeng and T.R. Martinez. Using a neural network to approximate an ensemble of classifiers. Neural Processing Letters, 12(3):225-237, 2000.
- G. Zweig and M. Padmanabhan. Boosting Gaussian mixtures in an LVCSR system. In Proc. of ICML, 2000. (PDF)
- S. Abney, R.E. Schapire, and Y. Singer. Boosting applied to tagging and PP attachment. In Proceedings of the Joint SIGDAT Conference on Empirical Methods in Natural Language Processing and Very Large Corpora, 1999.
- R. Avnimelech and N. Intrator. Boosted mixture of experts: An ensemble learning scheme. Neural Computation, 11(2):483-497, 1999.
- R. Avnimelech and N. Intrator. Boosting regression estimators. Neural Computation, 11:491-513, 1999. (PDF)
- E. Bauer and R. Kohavi. An empirical comparison of voting classification algorithms: Bagging, boosting and variants. Machine Learning, pages 105-142, 1999. (PDF)
- L. Breiman. Random forests-random features. Technical Report 567, Statistics Department, University of California, September 1999. (PDF)
- W.W. Cohen, R.E. Schapire, and Y. Singer. Learning to order things. In Advances in Neural Information Processing Systems 11: Proc. of NIPS'98. MIT Press, 1999. (PDF)
- T.G. Dietterich. An experimental comparison of three methods for constructing ensembles of decision trees: Bagging, boosting, and randomization. Machine Learning, 1999. (PDF)
- T.G. Dietterich. Machine learning research: Four current directions. AI Magazine, 18(4):97-136, 1999. (PDF)
- H. Drucker. Boosting using neural nets. In A.J.C. Sharkey, editor, Combining Artificial Neural Nets: Ensemble and Modular Learning, pages 51-77. Springer, 1999. (PDF)
- N. Duffy and D. Helmbold. A geometric approach to leveraging weak learners. In 4th European Conference on Computational Learning Theory. LNCS, 1999. (PDF)
- W. Fan, S.J. Stolfo, and J. Zhang. The application of AdaBoost for distributed, scalable and on-line learning. Unpublished manuscript, 1999. (PDF)
- W. Fan, S.J. Stolfo, J. Zhang, and P.K. Chan. AdaCost: Misclassification cost-sensitive boosting. In Proc. of ICML, 1999. (PDF)
- Y. Freund and R.E. Schapire. Adaptive game playing using multiplicative weights. Games and Economic Behavior, 29:79-103, 1999. (PDF)
- J.H. Friedman. Greedy function approximation: A gradient boosting machine. Technical report, Department of Statistics, Stanford University, February 1999. (PDF)
- V. Guruswami and A. Sahai. Multiclass learning, boosting, and error-correcting codes. In Proc. COLT'99, 1999.
- M. Haruno, S. Shirai, and Y. Ooyama. Using decision trees to construct a practical parser. Machine Learning, 34:131-149, 1999.
- J. Kivinen and M. Warmuth. Boosting as entropy projection. In Proc. COLT'99, 1999. (PDF)
- Y. Liu and X. Yao. Ensemble learning via negative correlation. Neural Networks, 12(10):1399-1404, 1999.
- L. Mason, P. Bartlett, and J. Baxter. Direct optimization of margins improves generalization in combined classifiers. In Advances in Neural Information Processing Systems 11: Proc. NIPS'1998, pages 288-294, 1999.
- L. Mason, J. Baxter, P.L. Bartlett, and M. Frean. Functional gradient techniques for combining hypotheses. In A.J. Smola, P.L. Bartlett, B. Schölkopf, and D. Schuurmans, editors, Advances in Large Margin Classifiers, pages 221-247. MIT Press, Cambridge, MA, 1999.
- P. Moerland and E. Mayoraz. DynaBoost: Combining boosted hypotheses in a dynamic way. Technical Report RR 99-09, IDIAP, Switzerland, May 1999. (PDF)
- D. Opitz and R. Maclin. Popular ensemble methods: An empirical study. Journal of AI Research, 11:169-198, 1999. (PDF)
- B. Igelnik, Y.-H. Pao, S.R. LeClair, and C.Y. Shen. The ensemble approach to neural-network learning and generalization. IEEE-NN, 10(1):19, January 1999.
- G. Rätsch, T. Onoda, and K.-R. Müller. Regularizing AdaBoost. In M.S. Kearns, S.A. Solla, and D.A. Cohn, editors, Advances in Neural Information Processing Systems 11:Proc. of NIPS'98, pages 564-570. MIT Press, 1999. (PDF)
- R.E. Schapire. A brief introduction to boosting. In Proceedings of the Sixteenth International Joint Conference on Artificial Intelligence, 1999. (PDF)
- R.E. Schapire. Theoretical views of boosting. In Computational Learning Theory: Fourth European Conference, EuroCOLT'99, 1999. (PDF)
- R.E. Schapire. Theoretical views of boosting and applications. In Tenth International Conference on Algorithmic Learning Theory, 1999. (PDF)
- R.E. Schapire and Y. Singer. Improved boosting algorithms using confidence-rated predictions. Machine Learning, 37(3):297-336, December 1999. (PDF)
- H. Schwenk. Using boosting to improve a hybrid HMM/neural network speech recognizer. In Proc. IEEE Int. Conf. on Acoustics, Speech and Signal Processing, pages 1009-1012, 1999.
- H. Schwenk and Y. Bengio. Training methods for adaptive boosting of neural networks for character recognition. In Advances in Neural Information Processing Systems 11: Proc. of NIPS'98, 1999.
- A. Smola, B. Schölkopf, and G. Rätsch. Linear programs for automatic accuracy control in regression. In Proceedings ICANN'99, Int. Conf. on Artificial Neural Networks, Berlin, 1999. Springer.
- D.H. Wolpert and W.G. Macready. An efficient method to estimate bagging's generalization error. Machine Learning, 35:41, 1999.
- Y. Freund and R.E. Schapire. A short introduction to boosting. Journal of Japanese Society for Artificial Intelligence, 14(5):771-780, September 1999. Appearing in Japanese, translation by Naoki Abe. (PDF)
- L. Breiman. Half & half bagging and hard boundary points. Technical Report 534, Statistics Department, University of California, September 1998. (PDF)
- L. Breiman. Randomizing outputs to increase prediction accuracy. Technical Report 518, Statistics Department, University of California, May 1998. (PDF)
- M. Frean and T. Downs. A simple cost function for boosting. Technical report, Dep. of Computer Science and Electrical Engineering, University of Queensland, 1998. (PDF)
- Y. Freund and R.E. Schapire. Discussion of the paper ``arcing classifiers'' by Leo Breiman. The Annals of Statistics, 26(3):824-832, 1998. (PDF)
- Y. Freund, R. Iyer, R.E. Schapire, and Y. Singer. An efficient boosting algorithm for combining preferences. In Proc. 15th International Conference on Machine Learning, 1998. (PDF)
- J. Friedman, T. Hastie, and R. Tibshirani. Additive logistic regression: a statistical view of boosting. Technical report, Department of Statistics, Sequoia Hall, Stanford University, July 1998.
- Y. Grandvalet. Least absolute shrinkage is equivalent to quadratic penalization. In International Conference on Artificial Neural Networks, 1998.
- A.J. Grove and D. Schuurmans. Boosting in the limit: Maximizing the margin of learned ensembles. In Proceedings of the Fifteenth National Conference on Artificial Intelligence, 1998. (PDF)
- R. Maclin. Boosting classifiers regionally. In Proc. of AAAI, 1998. (PDF)
- L. Mason, P. Bartlett, and J. Baxter. Improved generalization through explicit optimization of margins. Technical report, Department of Systems Engineering, Australian National University, 1998.
- T. Onoda, G. Rätsch, and K.-R. Müller. An asymptotic analysis of AdaBoost in the binary classification case. In L. Niklasson, M. Bodén, and T. Ziemke, editors, Proc. of the Int. Conf. on Artificial Neural Networks (ICANN'98), pages 195-200, March 1998. (PDF)
- J.R. Quinlan. MiniBoosting decision trees. Journal of AI Research, 1998. (PDF)
- G. Rätsch. Ensemble learning methods for classification. Master's thesis, Dep. of Computer Science, University of Potsdam, April 1998. In German. (PDF)
- G. Rätsch, T. Onoda, and K.-R. Müller. An improvement of AdaBoost to avoid overfitting. In Proc. of the Int. Conf. on Neural Information Processing (ICONIP), pages 506-509, Kitakyushu, Japan, May 1998. (PDF)
- G. Rätsch, T. Onoda, and K.-R. Müller. Soft margins for AdaBoost. Technical Report NC-TR-1998-021, Department of Computer Science, Royal Holloway, University of London, Egham, UK, 1998. (PDF)
- R.E. Schapire and Y. Singer. Improved boosting algorithms using confidence-rated predictions. In Proceedings of the Eleventh Annual Conference on Computational Learning Theory, pages 80-91, 1998.
- R.E. Schapire, Y. Freund, P. Bartlett, and W. S. Lee. Boosting the margin: A new explanation for the effectiveness of voting methods. The Annals of Statistics, 26(5):1651-1686, October 1998. (PDF)
- R.E. Schapire, Y. Singer, and A. Singhal. Boosting and Rocchio applied to text filtering. In Proceedings of the 21st Annual International Conference on Research and Development in Information Retrieval, 1998. (PDF)
- A. Bertoni, P. Campadelli, and M. Parodi. A boosting algorithm for regression. In W. Gerstner, A. Germond, M. Hasler, and J.-D. Nicoud, editors, Proceedings ICANN'97, Int. Conf. on Artificial Neural Networks, volume V of LNCS, pages 343-348, Berlin, 1997. Springer.
- L. Breiman. Arcing the edge. Technical Report 486, Statistics Department, University of California, June 1997. (PDF)
- L. Breiman. Pasting bites together for prediction in large data sets. Technical report, Statistics Department, University of California, July 1997. (PDF)
- L. Breiman. Prediction games and arcing algorithms. Technical Report 504, Statistics Department, University of California, December 1997. (PDF)
- P. Domingos. Why does bagging work? A Bayesian account and its implications. In David Heckerman, Heikki Mannila, Daryl Pregibon, and Ramasamy Uthurusamy, editors, Proceedings of the Third International Conference on Knowledge Discovery and Data Mining (KDD-97), page 155. AAAI Press, 1997. (PDF)
- H. Drucker. Improving regressors using boosting techniques. In Proceedings of the Fourteenth International Conference on Machine Learning, pages 107-115. Morgan Kaufmann, 1997. (PDF)
- Y. Freund and R.E. Schapire. A decision-theoretic generalization of on-line learning and an application to boosting. Journal of Computer and System Sciences, 55(1):119-139, August 1997. (PDF)
- Y. Freund, R.E. Schapire, Y. Singer, and M. K. Warmuth. Using and combining predictors that specialize. In Proceedings of the Twenty-Ninth Annual ACM Symposium on Theory of Computing, pages 334-343, El Paso, Texas, May 4-6, 1997.
- T. Heskes. Balancing between bagging and bumping. In M.C. Mozer, M.I. Jordan, and T. Petsche, editors, Advances in Neural Information Processing Systems 9: Proc. of NIPS'96, pages 466-472, Cambridge, 1997. MIT Press.
- C. Ji and S. Ma. Combinations of weak classifiers. IEEE Transactions on Neural Networks, 8(1):32-42, January 1997.
- J. Kivinen and M.K. Warmuth. Additive versus exponentiated gradient updates for linear prediction. Information and Computation, 132(1):1-64, 1997. (PDF)
- P.R. Lajbcygier and J.T. Connor. Improved option pricing using artificial neural networks and bootstrap methods. International Journal of Neural Systems, 8(4):457-471, 1997. (PDF)
- R. Maclin and D. Opitz. An empirical evaluation of bagging and boosting. In Proc. of AAAI, 1997. (PDF)
- D.D. Margineantu and T.G. Dietterich. Pruning adaptive boosting. In Proc. 14th International Conference on Machine Learning, pages 211-218. Morgan Kaufmann, 1997.
- R.E. Schapire. Using output codes to boost multiclass learning problems. In Machine Learning: Proceedings of the Fourteenth International Conference, pages 313-321, 1997. (PDF)
- R.E. Schapire, Y. Freund, P. Bartlett, and W.S. Lee. Boosting the margin: a new explanation for the effectiveness of voting methods. In Proc. 14th International Conference on Machine Learning, pages 322-330. Morgan Kaufmann, 1997.
- H. Schwenk and Y. Bengio. AdaBoosting neural networks. In W. Gerstner, A. Germond, M. Hasler, and J.-D. Nicoud, editors, Proc. of the Int. Conf. on Artificial Neural Networks (ICANN'97), volume 1327 of LNCS, pages 967-972, Berlin, 1997. Springer. (PDF)
- H. Schwenk and Y. Bengio. Adaptive boosting of neural networks for character recognition. Technical Report 1072, Département d'Informatique et de Recherche Opérationnelle, Université de Montréal, May 1997. (PDF)
- L. Breiman. Bagging predictors. Machine Learning, 24(2):123-140, 1996. (PDF)
- L. Breiman. Bias, variance, and arcing classifiers. Technical Report 460, Statistics Department, University of California, April 1996. (PDF)
- H. Drucker and C. Cortes. Boosting decision trees. In D.S. Touretzky, M.C. Mozer, and M.E. Hasselmo, editors, Advances in Neural Information Processing Systems 8: Proc. of NIPS'95, volume 8, pages 479-485. The MIT Press, 1996. (PDF)
- Y. Freund and R.E. Schapire. Experiments with a new boosting algorithm. In Proc. 13th International Conference on Machine Learning, pages 148-156. Morgan Kaufmann, 1996. (PDF)
- Y. Freund and R.E. Schapire. Game theory, on-line prediction and boosting. In Proc. 9th Annu. Conf. on Comput. Learning Theory, pages 325-332. ACM Press, New York, NY, 1996. (PDF)
- J.R. Quinlan. Bagging, boosting, and C4.5. In Proceedings of the Thirteenth National Conference on Artificial Intelligence and the Eighth Innovative Applications of Artificial Intelligence Conference, pages 725-730, Menlo Park, August 4-8, 1996. AAAI Press / MIT Press. (PDF)
- J.R. Quinlan. Boosting first-order learning. In S. Arikawa and A.K. Sharma, editors, Proceedings of the 7th International Workshop on Algorithmic Learning Theory, volume 1160 of LNAI, pages 143-155, Berlin, October 23-25, 1996. Springer. (PDF)
- P. Sollich and A. Krogh. Learning with ensembles: How overfitting can be useful. In D.S. Touretzky, M.C. Mozer, and M.E. Hasselmo, editors, Advances in Neural Information Processing Systems 8: Proc. of NIPS'95, pages 190-196. The MIT Press, 1996.
- T.G. Dietterich and G. Bakiri. Solving multiclass learning problems via error-correcting output codes. Journal of Artificial Intelligence Research, 2:263-286, 1995.
- S. Floyd and M. Warmuth. Sample compression, learnability, and the Vapnik-Chervonenkis dimension. Machine Learning, 21(3):269-304, 1995. (PDF)
- Y. Freund. Boosting a weak learning algorithm by majority. Information and Computation, 121(2):256-285, September 1995. (PDF)
- Y. Freund and R.E. Schapire. A decision-theoretic generalization of on-line learning and an application to boosting. In Proc. of the Second European Conference on Computational Learning Theory. LNCS, March 1995. (PDF)
- A. Krogh and J. Vedelsby. Neural network ensembles, cross validation, and active learning. In G. Tesauro, D. Touretzky, and T. Leen, editors, Advances in Neural Information Processing Systems 7: Proc. of NIPS'94, pages 231-238. The MIT Press, 1995.
- Y. LeCun, L.D. Jackel, L. Bottou, C. Cortes, J.S. Denker, H. Drucker, I. Guyon, U.A. Müller, E. Säckinger, P. Simard, and V. Vapnik. Learning algorithms for classification: A comparison on handwritten digit recognition. Neural Networks, pages 261-276, 1995.
- L. Bottou, C. Cortes, J.S. Denker, H. Drucker, I. Guyon, L.D. Jackel, Y. LeCun, U.A. Müller, E. Säckinger, P. Simard, and V. Vapnik. Comparison of classifier methods: a case study in handwritten digit recognition. In Proc. of the 12th International Conference on Pattern Recognition and Neural Networks, pages 77-87. IEEE Computer Society Press, 1994. (PDF)
- H. Drucker, C. Cortes, L.D. Jackel, Y. LeCun, and V. Vapnik. Boosting and other ensemble methods. Neural Computation, 6(6):1289-1301, 1994.
- M.P. Perrone. Putting it all together: Methods for combining neural networks. In J.D. Cowan, G. Tesauro, and J. Alspector, editors, Advances in Neural Information Processing Systems 6: Proc. of NIPS'93, pages 1188-1189. Morgan Kaufmann Publishers, Inc., 1994.
- H. Drucker, R. Schapire, and P. Simard. Boosting performance in neural networks. International Journal of Pattern Recognition and Artificial Intelligence, 7:705-719, 1993.
- M.P. Perrone. Improving Regression Estimation: Averaging Methods for Variance Reduction with Extensions to General Convex Measure Optimization. PhD thesis, Brown University, Institute for Brain and Neural Systems; Dr. Leon N Cooper, Thesis Supervisor, May 1993.
- M.P. Perrone and L.N. Cooper. When networks disagree: Ensemble method for neural networks. In R.J. Mammone, editor, Neural Networks for Speech and Image Processing. Chapman-Hall, 1993.
- R.E. Schapire. The Design and Analysis of Efficient Learning Algorithms. MIT Press, 1992.
- R.E. Schapire. The strength of weak learnability. Machine Learning, 5(2):197-227, 1990.
- D. Haussler. Decision theoretic generalizations of the PAC model for neural net and other learning applications. Technical Report UCSC-CRL-91-02, University of California, Santa Cruz, 1989. Also in Information and Computation, 100(1), September 1992.
- N. Littlestone and M. Warmuth. Relating data compression and learnability. Technical report, University of California at Santa Cruz, USA, June 10, 1986. (PDF)
- L. Breiman, J.H. Friedman, R.A. Olshen, and C.J. Stone. Classification and Regression Trees. Wadsworth & Brooks, 1984.
- D. Becker and J. Schürmann. Zur verstärkten Berücksichtigung schlecht erkennbarer Zeichen in der Lernstichprobe (On giving greater weight to poorly recognizable characters in the training sample). Technical Report DK 681.3:028, AEG-Telefunken, 1972. In German.