
Regression Analysis: Theory, Methods, and Applications by Ashish Sen


$98.09
Condition - New
Only 2 left

Summary

Regression Analysis: Theory, Methods, and Applications by Ashish Sen

This book offers an up-to-date, rigorous, and lucid treatment of the theory, methods, and applications of regression analysis, and is thus well suited both to readers interested in the theory and to those whose interests lie primarily with applications. It is further enhanced by real-life examples drawn from many disciplines, which illustrate the difficulties typically encountered in the practice of regression analysis. As a result, the book provides a sound foundation in the theory of this important subject.

Regression Analysis Reviews

I found this to be the most complete and up-to-date regression text I have come across... this text has much to offer.
- Journal of the American Statistical Association

The material is presented in a lucid and easy-to-understand style... can be ranked as one of the best textbooks on regression in the market.
- Mathematical Reviews

...a successful mix of theory and practice... It will serve nicely to teach both the logic behind regression and the data-analytic use of regression.
- SIAM Review

Table of Contents

1 Introduction.- 1.1 Relationships.- 1.2 Determining Relationships: A Specific Problem.- 1.3 The Model.- 1.4 Least Squares.- 1.5 Another Example and a Special Case.- 1.6 When Is Least Squares a Good Method?.- 1.7 A Measure of Fit for Simple Regression.- 1.8 Mean and Variance of b0 and b1.- 1.9 Confidence Intervals and Tests.- 1.10 Predictions.- Appendix to Chapter 1.- Problems.
2 Multiple Regression.- 2.1 Introduction.- 2.2 Regression Model in Matrix Notation.- 2.3 Least Squares Estimates.- 2.4 Examples.- 2.5 Gauss-Markov Conditions.- 2.6 Mean and Variance of Estimates Under G-M Conditions.- 2.7 Estimation of σ².- 2.8 Measures of Fit.- 2.9 The Gauss-Markov Theorem.- 2.10 The Centered Model.- 2.11 Centering and Scaling.- 2.12 *Constrained Least Squares.- Appendix to Chapter 2.- Problems.
3 Tests and Confidence Regions.- 3.1 Introduction.- 3.2 Linear Hypothesis.- 3.3 *Likelihood Ratio Test.- 3.4 *Distribution of Test Statistic.- 3.5 Two Special Cases.- 3.6 Examples.- 3.7 Comparison of Regression Equations.- 3.8 Confidence Intervals and Regions.- 3.8.1 C.I. for the Expectation of a Predicted Value.- 3.8.2 C.I. for a Future Observation.- 3.8.3 *Confidence Region for Regression Parameters.- 3.8.4 *C.I.'s for Linear Combinations of Coefficients.- Problems.
4 Indicator Variables.- 4.1 Introduction.- 4.2 A Simple Application.- 4.3 Polychotomous Variables.- 4.4 Continuous and Indicator Variables.- 4.5 Broken Line Regression.- 4.6 Indicators as Dependent Variables.- Problems.
5 The Normality Assumption.- 5.1 Introduction.- 5.2 Checking for Normality.- 5.2.1 Probability Plots.- 5.2.2 Tests for Normality.- 5.3 Invoking Large Sample Theory.- 5.4 *Bootstrapping.- 5.5 *Asymptotic Theory.- Problems.
6 Unequal Variances.- 6.1 Introduction.- 6.2 Detecting Heteroscedasticity.- 6.2.1 Formal Tests.- 6.3 Variance Stabilizing Transformations.- 6.4 Weighting.- Problems.
7 *Correlated Errors.- 7.1 Introduction.- 7.2 Generalized Least Squares: Case When Ω Is Known.- 7.3 Estimated Generalized Least Squares.- 7.3.1 Error Variances Unequal and Unknown.- 7.4 Nested Errors.- 7.5 The Growth Curve Model.- 7.6 Serial Correlation.- 7.6.1 The Durbin-Watson Test.- 7.7 Spatial Correlation.- 7.7.1 Testing for Spatial Correlation.- 7.7.2 Estimation of Parameters.- Problems.
8 Outliers and Influential Observations.- 8.1 Introduction.- 8.2 The Leverage.- 8.2.1 *Leverage as Description of Remoteness.- 8.3 The Residuals.- 8.4 Detecting Outliers and Points That Do Not Belong to the Model.- 8.5 Influential Observations.- 8.5.1 Other Measures of Influence.- 8.6 Examples.- Appendix to Chapter 8.- Problems.
9 Transformations.- 9.1 Introduction.- 9.1.1 An Important Word of Warning.- 9.2 Some Common Transformations.- 9.2.1 Polynomial Regression.- 9.2.2 Splines.- 9.2.3 Multiplicative Models.- 9.2.4 The Logit Model for Proportions.- 9.3 Deciding on the Need for Transformations.- 9.3.1 Examining Residual Plots.- 9.3.2 Use of Additional Terms.- 9.3.3 Use of Repeat Measurements.- 9.3.4 Daniel and Wood Near-Neighbor Approach.- 9.3.5 Another Method Based on Near Neighbors.- 9.4 Choosing Transformations.- 9.4.1 Graphical Method: One Independent Variable.- 9.4.2 Graphical Method: Many Independent Variables.- 9.4.3 Analytic Methods: Transforming the Response.- 9.4.4 Analytic Methods: Transforming the Predictors.- 9.4.5 Simultaneous Power Transformations for Predictors and Response.- Appendix to Chapter 9.- Problems.
10 Multicollinearity.- 10.1 Introduction.- 10.2 Multicollinearity and Its Effects.- 10.3 Detecting Multicollinearity.- 10.3.1 Tolerances and Variance Inflation Factors.- 10.3.2 Eigenvalues and Condition Numbers.- 10.3.3 Variance Components.- 10.4 Examples.- Problems.
11 Variable Selection.- 11.1 Introduction.- 11.2 Some Effects of Dropping Variables.- 11.2.1 Effects on Estimates of βj.- 11.2.2 *Effect on Estimation of Error Variance.- 11.2.3 *Effect on Covariance Matrix of Estimates.- 11.2.4 *Effect on Predicted Values: Mallows' Cp.- 11.3 Variable Selection Procedures.- 11.3.1 Search Over All Possible Subsets.- 11.3.2 Stepwise Procedures.- 11.3.3 Stagewise and Modified Stagewise Procedures.- 11.4 Examples.- Problems.
12 *Biased Estimation.- 12.1 Introduction.- 12.2 Principal Component Regression.- 12.2.1 Bias and Variance of Estimates.- 12.3 Ridge Regression.- 12.3.1 Physical Interpretations of Ridge Regression.- 12.3.2 Bias and Variance of Estimates.- 12.4 Shrinkage Estimator.- Problems.
A Matrices.- A.1 Addition and Multiplication.- A.2 The Transpose of a Matrix.- A.3 Null and Identity Matrices.- A.4 Vectors.- A.5 Rank of a Matrix.- A.6 Trace of a Matrix.- A.7 Partitioned Matrices.- A.8 Determinants.- A.9 Inverses.- A.10 Characteristic Roots and Vectors.- A.11 Idempotent Matrices.- A.12 The Generalized Inverse.- A.13 Quadratic Forms.- A.14 Vector Spaces.- Problems.
B Random Variables and Random Vectors.- B.1 Random Variables.- B.1.1 Independent Random Variables.- B.1.2 Correlated Random Variables.- B.1.3 Sample Statistics.- B.1.4 Linear Combinations of Random Variables.- B.2 Random Vectors.- B.3 The Multivariate Normal Distribution.- B.4 The Chi-Square Distributions.- B.5 The F and t Distributions.- B.6 Jacobian of Transformations.- B.7 Multiple Correlation.- Problems.
C Nonlinear Least Squares.- C.1 Gauss-Newton Type Algorithms.- C.1.1 The Gauss-Newton Procedure.- C.1.2 Step Halving.- C.1.3 Starting Values and Derivatives.- C.1.4 Marquardt Procedure.- C.2 Some Other Algorithms.- C.2.1 Steepest Descent Method.- C.2.2 Quasi-Newton Algorithms.- C.2.3 The Simplex Method.- C.2.4 Weighting.- C.3 Pitfalls.- C.4 Bias, Confidence Regions and Measures of Fit.- C.5 Examples.- Problems.
Tables.- References.- Author Index.

Additional information

SKU: NLS9781461287896
ISBN-13: 9781461287896
ISBN-10: 1461287898
Title: Regression Analysis: Theory, Methods, and Applications by Ashish Sen
Condition: New
Binding: Paperback
Publisher: Springer-Verlag New York Inc.
Publication date: 2011-12-23
Pages: 348
Book picture is for illustrative purposes only; the actual binding, cover, or edition may vary.
This is a new book; be the first to read this copy. With untouched pages and a perfect binding, your brand-new copy is ready to be opened for the first time.
