000 | 08140nam a22004933i 4500 | ||
---|---|---|---|
001 | EBC6126527 | ||
003 | MiAaPQ | ||
005 | 20240724114134.0 | ||
006 | m o d | | ||
007 | cr cnu|||||||| | ||
008 | 240724s2020 xx o ||||0 eng d | ||
020 | _a9781800208322 _q(electronic bk.) | ||
020 | _z9781800209046 | ||
035 | _a(MiAaPQ)EBC6126527 | ||
035 | _a(Au-PeEL)EBL6126527 | ||
035 | _a(OCoLC)1143634009 | ||
040 | _aMiAaPQ _beng _erda _epn _cMiAaPQ _dMiAaPQ | ||
050 | 4 | _aQ325.5 .B384 2020 | |
082 | 0 | _a006.31 | |
100 | 1 | _aBateman, Blaine. | |
245 | 1 | 4 | _aThe Supervised Learning Workshop : _bA New, Interactive Approach to Understanding Supervised Learning Algorithms, 2nd Edition. |
250 | _a2nd ed. | ||
264 | 1 | _aBirmingham : _bPackt Publishing, Limited, _c2020. | |
264 | 4 | _c©2020. | |
300 | _a1 online resource (531 pages) | ||
336 | _atext _btxt _2rdacontent | ||
337 | _acomputer _bc _2rdamedia | ||
338 | _aonline resource _bcr _2rdacarrier | ||
505 | 0 | _aCover -- FM -- Copyright -- Table of Contents -- Preface -- Chapter 1: Fundamentals -- Introduction -- When to Use Supervised Learning -- Python Packages and Modules -- Loading Data in Pandas -- Exercise 1.01: Loading and Summarizing the Titanic Dataset -- Exercise 1.02: Indexing and Selecting Data -- Exercise 1.03: Advanced Indexing and Selection -- Pandas Methods -- Exercise 1.04: Using the Aggregate Method -- Quantiles -- Lambda Functions -- Exercise 1.05: Creating Lambda Functions -- Data Quality Considerations -- Managing Missing Data -- Class Imbalance -- Low Sample Size -- Activity 1.01: Implementing Pandas Functions -- Summary -- Chapter 2: Exploratory Data Analysis and Visualization -- Introduction -- Exploratory Data Analysis (EDA) -- Summary Statistics and Central Values -- Exercise 2.01: Summarizing the Statistics of Our Dataset -- Missing Values -- Finding Missing Values -- Exercise 2.02: Visualizing Missing Values -- Imputation Strategies for Missing Values -- Exercise 2.03: Performing Imputation Using Pandas -- Exercise 2.04: Performing Imputation Using Scikit-Learn -- Exercise 2.05: Performing Imputation Using Inferred Values -- Activity 2.01: Summary Statistics and Missing Values -- Distribution of Values -- Target Variable -- Exercise 2.06: Plotting a Bar Chart -- Categorical Data -- Exercise 2.07: Identifying Data Types for Categorical Variables -- Exercise 2.08: Calculating Category Value Counts -- Exercise 2.09: Plotting a Pie Chart -- Continuous Data -- Skewness -- Kurtosis -- Exercise 2.10: Plotting a Histogram -- Exercise 2.11: Computing Skew and Kurtosis -- Activity 2.02: Representing the Distribution of Values Visually -- Relationships within the Data -- Relationship between Two Continuous Variables -- Pearson's Coefficient of Correlation -- Exercise 2.12: Plotting a Scatter Plot. | |
505 | 8 | _aExercise 2.13: Plotting a Correlation Heatmap -- Using Pairplots -- Exercise 2.14: Implementing a Pairplot -- Relationship between a Continuous and a Categorical Variable -- Exercise 2.15: Plotting a Bar Chart -- Exercise 2.16: Visualizing a Box Plot -- Relationship Between Two Categorical Variables -- Exercise 2.17: Plotting a Stacked Bar Chart -- Activity 2.03: Relationships within the Data -- Summary -- Chapter 3: Linear Regression -- Introduction -- Regression and Classification Problems -- The Machine Learning Workflow -- Business Understanding -- Data Understanding -- Data Preparation -- Modeling -- Evaluation -- Deployment -- Exercise 3.01: Plotting Data with a Moving Average -- Activity 3.01: Plotting Data with a Moving Average -- Linear Regression -- Least Squares Method -- The Scikit-Learn Model API -- Exercise 3.02: Fitting a Linear Model Using the Least Squares Method -- Activity 3.02: Linear Regression Using the Least Squares Method -- Linear Regression with Categorical Variables -- Exercise 3.03: Introducing Dummy Variables -- Activity 3.03: Dummy Variables -- Polynomial Models with Linear Regression -- Exercise 3.04: Polynomial Models with Linear Regression -- Activity 3.04: Feature Engineering with Linear Regression -- Generic Model Training -- Gradient Descent -- Exercise 3.05: Linear Regression with Gradient Descent -- Exercise 3.06: Optimizing Gradient Descent -- Activity 3.05: Gradient Descent -- Multiple Linear Regression -- Exercise 3.07: Multiple Linear Regression -- Summary -- Chapter 4: Autoregression -- Introduction -- Autoregression Models -- Exercise 4.01: Creating an Autoregression Model -- Activity 4.01: Autoregression Model Based on Periodic Data -- Summary -- Chapter 5: Classification Techniques -- Introduction -- Ordinary Least Squares as a Classifier -- Exercise 5.01: Ordinary Least Squares as a Classifier. | |
505 | 8 | _aLogistic Regression -- Exercise 5.02: Logistic Regression as a Classifier - Binary Classifier -- Exercise 5.03: Logistic Regression - Multiclass Classifier -- Activity 5.01: Ordinary Least Squares Classifier - Binary Classifier -- Select K Best Feature Selection -- Exercise 5.04: Breast Cancer Diagnosis Classification Using Logistic Regression -- Classification Using K-Nearest Neighbors -- Exercise 5.05: KNN Classification -- Exercise 5.06: Visualizing KNN Boundaries -- Activity 5.02: KNN Multiclass Classifier -- Classification Using Decision Trees -- Exercise 5.07: ID3 Classification -- Classification and Regression Tree -- Exercise 5.08: Breast Cancer Diagnosis Classification Using a CART Decision Tree -- Activity 5.03: Binary Classification Using a CART Decision Tree -- Artificial Neural Networks -- Exercise 5.09: Neural Networks - Multiclass Classifier -- Activity 5.04: Breast Cancer Diagnosis Classification Using Artificial Neural Networks -- Summary -- Chapter 6: Ensemble Modeling -- Introduction -- One-Hot Encoding -- Exercise 6.01: Importing Modules and Preparing the Dataset -- Overfitting and Underfitting -- Underfitting -- Overfitting -- Overcoming the Problem of Underfitting and Overfitting -- Bagging -- Bootstrapping -- Exercise 6.02: Using the Bagging Classifier -- Random Forest -- Exercise 6.03: Building the Ensemble Model Using Random Forest -- Boosting -- Adaptive Boosting -- Exercise 6.04: Implementing Adaptive Boosting -- Gradient Boosting -- Exercise 6.05: Implementing GradientBoostingClassifier to Build an Ensemble Model -- Stacking -- Exercise 6.06: Building a Stacked Model -- Activity 6.01: Stacking with Standalone and Ensemble Algorithms -- Summary -- Chapter 7: Model Evaluation -- Introduction -- Importing the Modules and Preparing Our Dataset -- Evaluation Metrics -- Regression Metrics. | |
505 | 8 | _aExercise 7.01: Calculating Regression Metrics -- Classification Metrics -- Numerical Metrics -- Curve Plots -- Exercise 7.02: Calculating Classification Metrics -- Splitting a Dataset -- Hold-Out Data -- K-Fold Cross-Validation -- Sampling -- Exercise 7.03: Performing K-Fold Cross-Validation with Stratified Sampling -- Performance Improvement Tactics -- Variation in Train and Test Errors -- Learning Curve -- Validation Curve -- Hyperparameter Tuning -- Exercise 7.04: Hyperparameter Tuning with Random Search -- Feature Importance -- Exercise 7.05: Feature Importance Using Random Forest -- Activity 7.01: Final Test Project -- Summary -- Appendix -- Index. | |
520 | _aCut through the noise and get real results with a step-by-step approach to understanding supervised learning algorithms. | ||
588 | _aDescription based on publisher supplied metadata and other sources. | ||
590 | _aElectronic reproduction. Ann Arbor, Michigan : ProQuest Ebook Central, 2024. Available via World Wide Web. Access may be limited to ProQuest Ebook Central affiliated libraries. | ||
650 | 0 | _aMachine learning. | |
655 | 4 | _aElectronic books. | |
700 | 1 | _aJha, Ashish Ranjan. | |
700 | 1 | _aJohnston, Benjamin. | |
700 | 1 | _aMathur, Ishita. | |
776 | 0 | 8 | _iPrint version: _aBateman, Blaine _tThe Supervised Learning Workshop _dBirmingham : Packt Publishing, Limited, c2020 _z9781800209046 |
797 | 2 | _aProQuest (Firm) | |
856 | 4 | 0 | _uhttps://ebookcentral.proquest.com/lib/orpp/detail.action?docID=6126527 _zClick to View |
999 | _c16460 _d16460 | ||