
Applied Regression Analysis.

Material type: Text
Series: New York Academy of Sciences Series
Publisher: Newark : John Wiley & Sons, Incorporated, 1998
Copyright date: ©1998
Edition: 1st ed.
Description: 1 online resource (738 pages)
Content type: text
Media type: computer
Carrier type: online resource
ISBN: 9781118625620
Additional physical formats: Print version: Applied Regression Analysis
DDC classification: 519.5/36
LOC classification: QA278.2 .D7 1998
Contents:
Cover -- Title Page -- Copyright -- Contents -- Preface to the Third Edition -- About the Software -- Chapter 0: Basic Prerequisite Knowledge -- 0.1. Distributions: Normal, t, and F -- Normal Distribution -- Gamma Function -- t-Distribution -- F-Distribution -- 0.2. Confidence Intervals (or Bands) and t-Tests -- 0.3. Elements of Matrix Algebra -- Matrix, Vector, Scalar -- Equality -- Sum and Difference -- Transpose -- Symmetry -- Multiplication -- Special Matrices and Vectors -- Orthogonality -- Inverse Matrix -- Obtaining an Inverse -- Determinants -- Common Factors -- Chapter 1: Fitting a Straight Line by Least Squares -- 1.0. Introduction: The Need for Statistical Analysis -- 1.1. Straight Line Relationship Between Two Variables -- 1.2. Linear Regression: Fitting a Straight Line by Least Squares -- Meaning of Linear Model -- Least Squares Estimation -- Pocket-Calculator Form -- Calculations for the Steam Data -- Centering the Data -- 1.3. The Analysis of Variance -- Sums of Squares -- Degrees of Freedom (df) -- Analysis of Variance Table -- Steam Data Calculations -- Skeleton Analysis of Variance Table -- R² Statistic -- 1.4. Confidence Intervals and Tests for β0 and β1 -- Standard Deviation of the Slope b1 -- Confidence Interval for β1 -- Test for H0: β1 = β10 Versus H1: β1 ≠ β10 -- Reject or Do Not Reject -- Confidence Interval Represents a Set of Tests -- Standard Deviation of the Intercept -- Confidence Interval for β0 -- 1.5. F-Test for Significance of Regression -- P-Values for F-Statistics -- F = t² -- P-Values for t-Statistics -- 1.6. The Correlation Between X and Y -- Correlation and Regression -- Rxy and R Connections -- Testing a Single Correlation -- 1.7. Summary of the Straight Line Fit Computations -- Pocket-Calculator Computations -- 1.8. Historical Remarks -- Appendix 1A. Steam Plant Data.
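To make Chapter 1's central computation concrete, here is a minimal sketch, my own illustration and not the software the record's "About the Software" entry refers to: a Python/NumPy straight-line fit on synthetic data (the name fit_line and the data are hypothetical) that also checks the F = t² identity of Section 1.5.

    import numpy as np

    def fit_line(x, y):
        """Least-squares estimates b0, b1 for Y = beta0 + beta1*X + e."""
        n = len(x)
        sxx = np.sum((x - x.mean()) ** 2)                # corrected SS of x
        sxy = np.sum((x - x.mean()) * (y - y.mean()))    # corrected cross-product
        b1 = sxy / sxx                                   # slope estimate
        b0 = y.mean() - b1 * x.mean()                    # intercept estimate
        resid = y - (b0 + b1 * x)
        s2 = np.sum(resid ** 2) / (n - 2)                # residual mean square s^2
        t = b1 / np.sqrt(s2 / sxx)                       # t-statistic for H0: beta1 = 0
        F = (b1 * sxy) / s2                              # F for significance of regression
        return b0, b1, t, F

    rng = np.random.default_rng(0)
    x = np.linspace(0.0, 10.0, 25)
    y = 2.0 + 0.5 * x + rng.normal(scale=1.0, size=x.size)
    b0, b1, t, F = fit_line(x, y)
    print(b0, b1, np.isclose(t ** 2, F))                 # True: F = t^2, as in Section 1.5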
Exercises -- Chapter 2: Checking the Straight Line Fit -- 2.1. Lack of Fit and Pure Error -- General Discussion of Variance and Bias -- How Big Is σ²? -- Genuine Repeats Are Needed -- Calculation of Pure Error and Lack of Fit Mean Squares -- Special Formula When nj = 2 -- Split of the Residual SS -- Effect of Repeat Runs on R² -- Looking at the Data and Fitted Model -- Pure Error in the Many Predictors Case -- Adding (or Dropping) X's Can Affect Maximum R² -- Approximate Repeats -- Generic Pure Error Situations Illustrated via Straight Line Fits -- 2.2. Testing Homogeneity of Pure Error -- Bartlett's Test -- Bartlett's Test Modified for Kurtosis -- Levene's Test Using Means -- Levene's Test Using Medians -- Some Cautionary Remarks -- A Second Example -- 2.3. Examining Residuals: The Basic Plots -- How Should the Residuals Behave? -- 2.4. Non-normality Checks on Residuals -- Normal Plot of Residuals -- 2.5. Checks for Time Effects, Nonconstant Variance, Need for Transformation, and Curvature -- Three Questions and Answers -- Comment -- 2.6. Other Residuals Plots -- Dependencies Between Residuals -- 2.7. Durbin-Watson Test -- 2.8. Reference Books for Analysis of Residuals -- Appendix 2A. Normal Plots -- Normal Scores -- Outliers -- Some General Characteristics of Normal Plots -- Making Your Own Probability Paper -- Appendix 2B. Minitab Instructions -- Exercises -- Chapter 3: Fitting Straight Lines: Special Topics -- 3.0. Summary and Preliminaries -- Covariance of Two Linear Functions -- 3.1. Standard Error of Ŷ -- Intervals for Individual Observations and Means of q Observations -- 3.2. Inverse Regression (Straight Line Case) -- 3.3. Some Practical Design of Experiment Implications of Regression -- Experimental Strategy Decisions -- An Example -- Comments on Table 3.1 -- 3.4. Straight Line Regression When Both Variables Are Subject to Error.
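Chapter 2's lack-of-fit test rests on genuine repeat runs. Below is a minimal sketch of the pure-error split, assuming NumPy arrays and a hypothetical helper name pure_error_split; the resulting mean squares SS_LOF/df_LOF and SS_PE/df_PE would then be compared via an F-ratio, as the chapter describes.

    import numpy as np

    def pure_error_split(x, y, fitted):
        """Split residual SS into pure error (from genuine repeats) and lack of fit."""
        ss_resid = np.sum((y - fitted) ** 2)
        ss_pe, df_pe = 0.0, 0
        for xv in np.unique(x):
            grp = y[x == xv]
            if grp.size > 1:                             # only repeated x-values count
                ss_pe += np.sum((grp - grp.mean()) ** 2)
                df_pe += grp.size - 1
        ss_lof = ss_resid - ss_pe                        # lack-of-fit SS by subtraction
        return ss_pe, df_pe, ss_lof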
Practical Advice -- Geometric Mean Functional Relationship -- References -- Exercises for Chapters 1-3 -- Chapter 4: Regression in Matrix Terms: Straight Line Case -- Matrices -- 4.1. Fitting a Straight Line in Matrix Terms -- Manipulating Matrices -- Orthogonality -- The Model in Matrix Form -- Setup for a Quadratic Model -- Transpose -- Inverse of a Matrix -- Inverses of Small Matrices -- Matrix Symmetry for Square Matrices -- Diagonal Matrices -- Inverting Partitioned Matrices with Blocks of Zeros -- Less Obvious Partitioning -- Back to the Straight Line Case -- Solving the Normal Equations -- A Small Sermon on Rounding Errors -- Section Summary -- 4.2. Singularity: What Happens in Regression to Make X'X Singular? An Example -- Singularity in the General Linear Regression Context -- 4.3. The Analysis of Variance in Matrix Terms -- 4.4. The Variances and Covariance of b0 and b1 from the Matrix Calculation -- Correlation Between b0 and b1 -- 4.5. Variance of Ŷ Using the Matrix Development -- 4.6. Summary of Matrix Approach to Fitting a Straight Line (Nonsingular Case) -- 4.7. The General Regression Situation -- Exercises for Chapter 4 -- Chapter 5: The General Regression Situation -- 5.1. General Linear Regression -- A Justification for Using Least Squares -- 5.2. Least Squares Properties -- The R² Statistic -- R² Can Be Deceptive -- Adjusted R² Statistic -- 5.3. Least Squares Properties When ε ~ N(0, Iσ²) -- Just Significant Regressions May Not Predict Well -- The Distribution of R² -- Properties, Continued -- Bonferroni Limits -- 5.4. Confidence Intervals Versus Regions -- Moral -- 5.5. More on Confidence Intervals Versus Regions -- When F-Test and t-Tests Conflict -- References -- Appendix 5A. Selected Useful Matrix Results -- Exercises -- Chapter 6: Extra Sums of Squares and Tests for Several Parameters Being Zero.
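Chapter 4 recasts the fit as b = (X'X)⁻¹X'y. The sketch below, my own illustration on synthetic data, contrasts that textbook formula with a numerically stabler least-squares solve, in the spirit of the chapter's "small sermon on rounding errors", and forms V(b) = (X'X)⁻¹s² from Section 4.4.

    import numpy as np

    x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
    y = np.array([2.1, 2.9, 3.8, 5.2, 5.9])
    X = np.column_stack([np.ones_like(x), x])            # design matrix: columns 1, x

    b_textbook = np.linalg.inv(X.T @ X) @ X.T @ y        # b = (X'X)^-1 X'y (Chapter 4)
    b_stable, *_ = np.linalg.lstsq(X, y, rcond=None)     # QR-based solve; avoids the
                                                         # explicit inverse and its rounding error
    s2 = np.sum((y - X @ b_stable) ** 2) / (len(y) - 2)  # residual mean square s^2
    cov_b = s2 * np.linalg.inv(X.T @ X)                  # V(b) = (X'X)^-1 s^2 (Section 4.4)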
6.1. The "Extra Sum of Squares" Principle -- Polynomial Models -- Other Points -- Two Alternative Forms of the Extra SS -- Sequential Sums of Squares -- Special Problems with Polynomial Models -- Partial Sums of Squares -- When t = √F -- 6.2. Two Predictor Variables: Example -- How Useful Is the Fitted Equation? -- What Has Been Accomplished by the Addition of a Second Predictor Variable (Namely, X6)? -- The Standard Error s -- Extra SS F-Test Criterion -- Standard Error of bi -- Correlations Between Parameter Estimates -- Confidence Limits for the True Mean Value of Y, Given a Specific Set of X's -- Confidence Limits for the Mean of q Observations Given a Specific Set of X's -- Examining the Residuals -- 6.3. Sum of Squares of a Set of Linear Functions of Y's -- Appendix 6A. Orthogonal Columns in the X Matrix -- Appendix 6B. Two Predictors: Sequential Sums of Squares -- References -- Exercises for Chapters 5 and 6 -- Chapter 7: Serial Correlation in the Residuals and the Durbin-Watson Test -- 7.1. Serial Correlation in Residuals -- 7.2. The Durbin-Watson Test for a Certain Type of Serial Correlation -- Primary Test, Tables of dL and dU -- A Simplified Test -- Width of the Primary Test Inconclusive Region -- Mean Square Successive Difference -- 7.3. Examining Runs in the Time Sequence Plot of Residuals: Runs Test -- Runs -- Tables for Modest n1 and n2 -- Larger n1 and n2 Values -- Comments -- References -- Exercises for Chapter 7 -- Chapter 8: More on Checking Fitted Models -- 8.1. The Hat Matrix H and the Various Types of Residuals -- Variance-Covariance Matrix of e -- Other Facts About H -- Internally Studentized Residuals -- Extra Sum of Squares Attributable to ej -- Externally Studentized Residuals -- Other Comments -- 8.2. Added Variable Plot and Partial Residuals -- Added Variable Plot -- Partial Residuals.
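Two of the statistics named above are short enough to sketch directly: the extra-sum-of-squares F-test of Chapter 6 and the Durbin-Watson statistic of Chapter 7. The helper names rss, extra_ss_f, and durbin_watson are mine, not the book's.

    import numpy as np

    def rss(X, y):
        """Residual sum of squares of the least-squares fit of y on X."""
        b, *_ = np.linalg.lstsq(X, y, rcond=None)
        return np.sum((y - X @ b) ** 2)

    def extra_ss_f(X_full, X_reduced, y):
        """F-statistic for H0: the extra parameters in the full model are zero."""
        q = X_full.shape[1] - X_reduced.shape[1]         # number of extra parameters
        df = len(y) - X_full.shape[1]                    # residual df, full model
        extra_ss = rss(X_reduced, y) - rss(X_full, y)    # SS(extra | reduced)
        return (extra_ss / q) / (rss(X_full, y) / df)    # compare with F(q, df)

    def durbin_watson(e):
        """d = sum (e_t - e_{t-1})^2 / sum e_t^2, from time-ordered residuals;
        d near 2 suggests no lag-1 serial correlation (compare with dL, dU tables)."""
        return np.sum(np.diff(e) ** 2) / np.sum(e ** 2)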
8.3. Detection of Influential Observations: Cook's Statistics -- Higher-Order Cook's Statistics -- Another Worked Example -- Plots -- 8.4. Other Statistics Measuring Influence -- The DFFITS Statistics -- Atkinson's Modified Cook's Statistics -- 8.5. Reference Books for Analysis of Residuals -- Exercises for Chapter 8 -- Chapter 9: Multiple Regression: Special Topics -- 9.1. Testing a General Linear Hypothesis -- Testing a General Linear Hypothesis Cβ = 0 -- 9.2. Generalized Least Squares and Weighted Least Squares -- Generalized Least Squares Residuals -- General Comments -- Application to Serially Correlated Data -- 9.3. An Example of Weighted Least Squares -- 9.4. A Numerical Example of Weighted Least Squares -- 9.5. Restricted Least Squares -- 9.6. Inverse Regression (Multiple Predictor Case) -- 9.7. Planar Regression When All the Variables Are Subject to Error -- Appendix 9A. Lagrange's Undetermined Multipliers -- Notation -- Basic Method -- Is the Solution a Maximum or Minimum? -- Exercises for Chapter 9 -- Chapter 10: Bias in Regression Estimates, and Expected Values of Mean Squares and Sums of Squares -- 10.1. Bias in Regression Estimates -- 10.2. The Effect of Bias on the Least Squares Analysis of Variance -- 10.3. Finding the Expected Values of Mean Squares -- 10.4. Expected Value of Extra Sum of Squares -- Exercises for Chapter 10 -- Chapter 11: On Worthwhile Regressions, Big F's, and R² -- 11.1. Is My Regression a Useful One? -- An Alternative and Simpler Check -- Proof of (11.1.3) -- Comment -- 11.2. A Conversation About R² -- What Should One Do for Linear Regression? -- References -- Appendix 11A. How Significant Should My Regression Be? -- The γm Criterion -- Exercises for Chapter 11 -- Chapter 12: Models Containing Functions of the Predictors, Including Polynomial Models -- 12.1. More Complicated Model Functions.
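Chapter 8's diagnostics all flow from the hat matrix H = X(X'X)⁻¹X'. A minimal sketch of leverages, internally studentized residuals, and Cook's statistic follows (the name influence is hypothetical; forming the full n×n hat matrix is fine for illustration, though not for large n).

    import numpy as np

    def influence(X, y):
        """Leverages h_ii, internally studentized residuals, and Cook's D (Chapter 8)."""
        H = X @ np.linalg.inv(X.T @ X) @ X.T             # hat matrix H
        e = y - H @ y                                    # residuals e = (I - H) y
        n, p = X.shape
        s2 = np.sum(e ** 2) / (n - p)                    # residual mean square
        h = np.diag(H)                                   # leverages h_ii
        r = e / np.sqrt(s2 * (1.0 - h))                  # internally studentized residuals
        D = r ** 2 * h / (p * (1.0 - h))                 # Cook's statistic D_i
        return h, r, D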
Polynomial Models of Various Orders in the Xj.
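Chapter 12's point is that a polynomial in x is still linear in its parameters, so the same least-squares machinery applies. A one-line illustration of this idea, with synthetic data of my own:

    import numpy as np

    rng = np.random.default_rng(1)
    x = np.linspace(-1.0, 1.0, 30)
    y = 1.0 - 3.0 * x + 2.0 * x ** 2 + rng.normal(scale=0.1, size=x.size)
    X = np.vander(x, N=3, increasing=True)               # columns 1, x, x^2
    b, *_ = np.linalg.lstsq(X, y, rcond=None)            # estimates of beta0, beta1, beta2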


Description based on publisher-supplied metadata and other sources.

Electronic reproduction. Ann Arbor, Michigan : ProQuest Ebook Central, 2024. Available via World Wide Web. Access may be limited to ProQuest Ebook Central affiliated libraries.

