Estimated delivery date
Delivery times are affected by the public holiday (01.05).
Česká pošta: Thursday 02.05
PPL: Thursday 02.05
Personal pickup: Friday 03.05
The dates are approximate only and may vary depending on the chosen payment method. We will keep you informed about the progress of your shipment by e-mail.
When ordering a larger number of products, we cannot guarantee delivery by the date shown.

Machine Learning

15 % discount

1 935 Kč (original price 2 282 Kč)

Chapter 1 - A Brief Review on Machine Learning
1.1 Machine Learning definition
1.2 Main types of learning
1.3 Supervised learning
1.4 How a supervised algorithm learns?
1.5 Illustrating the Supervised Learning
1.5.1 The Perceptron
1.5.2 Multilayer Perceptron
1.6 Concluding Remarks

Chapter 2 - Statistical Learning Theory
2.1 Motivation
2.2 Basic concepts
2.2.1 Probability densities and joint probabilities
2.2.2 Identically and independently distributed data
2.2.3 Assumptions considered by the Statistical Learning Theory
2.2.4 Expected risk and generalization
2.2.5 Bounds for generalization with a practical example
2.2.6 Bayes risk and universal consistency
2.2.7 Consistency, overfitting and underfitting
2.2.8 Bias of classification algorithms
2.3 Empirical Risk Minimization Principle
2.3.1 Consistency and the ERM Principle
2.3.2 Restriction of the space of admissible functions
2.3.3 Ensuring uniform convergence in practice
2.4 Symmetrization lemma and the shattering coefficient
2.4.1 Shattering coefficient as a capacity measure
2.4.2 Making the ERM Principle consistent for infinite functions
2.5 Generalization bounds
2.6 The Vapnik-Chervonenkis dimension
2.6.1 Margin bounds
2.7 Concluding Remarks

Chapter 3 - Assessing Learning Algorithms
3.1 Mapping the concepts of the Statistical Learning Theory
3.2 Using the Chernoff bound
3.3 Using the Generalization Bound
3.4 Using the SVM Generalization Bound
3.5 Empirical Study of the Biases of Classification Algorithms
3.6 Concluding Remarks

Chapter 4 - Introduction to Support Vector Machines
4.1 Using basic Algebra to build a classification algorithm
4.2 Hyperplane-based classification: an intuitive view
4.3 Hyperplane-based classification: an algebraic view
4.3.1 Lagrange multipliers
4.3.2 Karush-Kuhn-Tucker conditions
4.4 Formulating the hard-margin SVM optimization problem
4.5 Formulating the soft-margin SVM optimization problem
4.6 Concluding Remarks

Chapter 5 - In Search for the Optimization Algorithm
5.1 What is an optimization problem?
5.2 Main types of optimization problems
5.3 Linear optimization problems
5.3.1 Solving through graphing
5.3.2 Primal and dual forms of linear problems
5.3.2.1 Using the table and rules
5.3.2.2 Graphical interpretation of primal and dual forms
5.3.2.3 Using Lagrange multipliers
5.3.3 Using an algorithmic approach to solve linear problems
5.3.4 On the KKT conditions for linear problems
5.3.4.1 Applying the rules
5.3.4.2 Graphical interpretation of the KKT conditions
5.4 Convex optimization problems
5.4.1 Interior Point Methods
5.4.2 The Primal-Dual Path Following Interior Point Method
5.4.3 Implementing the Interior Point Method to solve our first optimization problem
5.4.4 Implementing the Interior Point Method to solve the SVM optimization problem
5.4.5 Solving the SVM optimization problem using package LowRankQP
5.5 Concluding Remarks

Chapter 6 - A Brief Introduction on Kernels
6.1 Definitions, typical kernels and examples
6.1.1 The Polynomial kernel
6.1.2 The Radial Basis Function kernel
6.1.3 The Sigmoidal Kernel
6.1.4 Practical examples with kernels
6.2 Linear Algebra
6.2.1 Basis
6.2.2 Linear transformation
6.2.3 Inverses of Linear Transformations
6.2.4 Dot products
6.2.5 Change of basis and orthonormal basis
6.2.6 Eigenvalues and Eigenvectors
Author:
Publisher: Springer, Berlin
Year of publication: 2018
Language: English
Binding: Hardback