| Tag | Value |
|---|---|
| 000 | 02026nam a22002657a 4500 |
| 003 | OSt |
| 005 | 20231010025038.0 |
| 008 | 180313b \|\|\|\|\| \|\|\|\| 00\| 0 eng d |
| 020 | _a9780470641835 |
| 040 | _beng _cUNIMY |
| 050 | _aQ325.5 _b.K85 2011 |
| 100 | _aKulkarni, Sanjeev. |
| 245 | _aAn elementary introduction to statistical learning theory / _cSanjeev Kulkarni, Gilbert Harman. |
| 260 | _aHoboken, N.J. : _bWiley, _cc2011. |
| 300 | _axiv, 209 p. : _bill. ; _c24 cm. |
| 490 | _aWiley series in probability and statistics |
| 504 | _aIncludes bibliographical references and indexes. |
| 505 | _aIntroduction: Classification, Learning, Features, and Applications -- Probability -- Probability Densities -- The Pattern Recognition Problem -- The Optimal Bayes Decision Rule -- Learning from Examples -- The Nearest Neighbor Rule -- Kernel Rules -- Neural Networks: Perceptrons -- Multilayer Networks -- PAC Learning -- VC Dimension -- Infinite VC Dimension -- The Function Estimation Problem -- Learning Function Estimation -- Simplicity -- Support Vector Machines -- Boosting -- Bibliography. |
| 520 | _a"A joint endeavor from leading researchers in the fields of philosophy and electrical engineering, An Elementary Introduction to Statistical Learning Theory provides a broad and accessible introduction to the rapidly evolving field of statistical pattern recognition and statistical learning theory. Exploring topics that are not often covered in introductory-level books on statistical learning theory, including PAC learning, VC dimension, and simplicity, the authors present upper-undergraduate and graduate students with the basic theory behind contemporary machine learning and uniquely suggest that it serves as an excellent framework for philosophical thinking about inductive inference." |
| 650 | _aMachine learning _xStatistical methods. |
| 650 | _aPattern recognition systems. |
| 700 | _aHarman, Gilbert. |
| 830 | _aWiley series in probability and statistics. |
| 942 | _2lcc _cBK |
| 999 | _c11848 _d11848 |