Title:
Introductory regression analysis : with computer application for business and economics
Author:
Webster, Allen.
ISBN:
9780415899321
9780415899338
9780203182567
Publication Information:
New York ; London : Routledge, 2013.
Physical Description:
xvi, 472 p. : col. ill. ; 27 cm.
General Note:
Formerly CIP.
Contents:
Machine generated contents note: ch. 1 A Review of Basic Concepts -- Introduction -- 1.1.The Importance of Making Systematic Decisions -- 1.2.The Process of Statistical Analysis -- Data Collection -- Organizing the Data -- Analyzing the Data -- Interpreting the Results -- Prediction and Forecasting -- 1.3.Our "Arabic" Number System -- 1.4.Some Basic Definitions -- Populations and Samples -- Sampling Error -- Sources of Sampling Error: Sampling Bias and Plain Bad Luck -- A Sampling Distribution -- Types of Variables -- 1.5.Levels of Data Measurement -- Nominal Data -- Ordinal Data -- Interval Data -- Ratio Data -- 1.6.Properties of Good Estimators -- A Good Estimator Is Unbiased -- A Good Estimator Is Efficient -- A Good Estimator Is Consistent -- A Good Estimator Is Sufficient -- 1.7.Other Considerations -- 1.8.Probability Distributions -- 1.9.The Development and Application of Models -- 1.10."In God We Trust---Everybody Else Has to Bring Data" -- Chapter Problems --
Contents note continued: Appendix: Excel Commands and Common Probability Distributions -- The Normal Distribution -- Student's t-Distribution -- The F-Distribution -- The Chi-Square Distribution -- ch. 2 An Introduction to Regression and Correlation Analysis -- Introduction -- 2.1.The Simple Regression Model -- 2.2.Estimating the Model: Ordinary Least Squares -- Multiple Regression: A Look Ahead -- Calculating the Residuals -- 2.3.Why the Process Is Called Ordinary Least Squares -- 2.4.Properties and Assumptions of the OLS Model -- 2.5.The Gauss--Markov Theorem -- 2.6.Measures of Goodness of Fit -- The Standard Error of the Estimate -- The Coefficient of Determination -- How r2 Can Be Used as a Measure of Goodness of Fit -- 2.7.Limitations of Regression and Correlation -- 2.8.Regression Through the Origin -- 2.9.Computer Applications -- Using Excel -- Using Minitab -- Using SPSS -- 2.10.Review Problems -- Chapter Problems -- Conceptual Problems -- Computational Problems --
Contents note continued: Computer Problems -- ch. 3 Statistical Inferences in the Simple Regression Model -- Introduction -- 3.1.Confidence Interval Estimation -- Conditional Mean Interval -- The Predictive Interval -- Factors that Affect the Width of the Interval -- Confidence Interval for the Population Regression Coefficient, β1 -- Confidence Interval for the Correlation Coefficient, ρ -- 3.2.Hypothesis Testing: Checking for Statistical Significance -- Hypothesis Test for the Population Regression Coefficient, β1 -- The Meaning of the "Level of Significance" -- The Hypothesis Test for the Population Correlation Coefficient, ρ -- 3.3.Large Samples and the Standard Normal Distribution -- 3.4.The p-Value and Its Role in Inferential Analysis -- How to Detect and Interpret an Extremely Small p-Value -- 3.5.Computer Applications -- Using Excel -- Using Minitab -- Using SPSS -- 3.6.Review Problem -- Chapter Problems -- Conceptual Problems --
Contents note continued: Computational Problems -- Computer Problems -- ch. 4 Multiple Regression: Using Two or More Predictor Variables -- Introduction -- Additional Assumptions -- 4.1.The Multiple Regression Model -- The Adjusted Coefficient of Determination -- Analyzing the Model -- A Change in the Coefficient for GDP -- 4.2.The Issue of Multicollinearity -- The Problems of Multicollinearity -- Detecting Multicollinearity -- Treating the Problem of Multicollinearity -- 4.3.Analysis of Variance: Using the F-Test for Significance -- 4.4.Dummy Variables -- Allowing for More Responses in a Qualitative Variable -- Using Dummy Variables to Deseasonalize Time Series Data -- Interpreting a Computer's Printout -- 4.5.Interaction Between Independent Variables -- 4.6.Incorporating Slope Dummies -- 4.7.Control Variables -- 4.8.A Partial F-Test -- 4.9.Computer Applications -- Excel -- Minitab -- SPSS -- 4.10.Review Problem -- Chapter Problems -- Conceptual Problems --
Contents note continued: Computational Problems -- Computer Problem -- ch. 5 Residual Analysis and Model Specification -- Introduction -- 5.1.Using Residuals to Evaluate the Model -- A Test for Randomness -- Testing the Assumption of a Constant Variance -- The Presence of Autocorrelation -- Checking for Linearity -- Test for Normality -- 5.2.Standardized Regression Coefficients -- 5.3.Proper Model Specification: Getting it Right -- Consequences of an Omitted Variable -- All Combinations -- Backward Elimination, Forward Selection, and Stepwise Regression -- 5.4.Rescaling the Variables -- Rescaling the Dependent Variable -- Rescaling an Independent Variable -- 5.5.The Lagrange Multiplier Test for Significant Variables -- Chapter Problems -- Conceptual Problems -- Computer Problems -- ch. 6 Using Qualitative and Limited Dependent Variables -- Introduction -- 6.1.Logit Analysis -- An Example of a Logit Model -- The Log-Likelihood Statistic -- Classification Tables --
Contents note continued: Maximum Likelihood and the Use of Iterations -- 6.2.The Linear Probability Model and Weighted Least Squares -- The Linear Probability Model -- Weighted Least Squares -- 6.3.Discriminant Analysis -- Evaluation -- Cross-Validation -- The Eigenvalue and Wilks' Lambda -- Chapter Problems -- Conceptual Problems -- Computational Problems -- Computer Problem -- ch. 7 Heteroscedasticity -- Introduction -- 7.1.Consequences of Heteroscedasticity -- 7.2.Detecting Heteroscedasticity -- Using Plots of Residuals -- The Park Test -- The Glejser Test -- White's Test -- The Goldfeld--Quandt Test -- 7.3.Remedial Measures -- If the Population Error Variances Are Known -- Weighted Least Squares with σ2i Unknown -- Applying WLS to Our Income/Consumption Data -- Heteroscedasticity, Elasticities, and the Use of Logs -- Elasticity of Demand and Total Sales Revenue -- How Elasticity Relates to Heteroscedasticity -- How Logs Estimate Elasticities -- Chapter Problems --
Contents note continued: Conceptual Problems -- Computational Problems -- Appendix: Logs and Elasticity -- ch. 8 Autocorrelation -- Introduction -- 8.1.The Nature of Autocorrelation -- 8.2.Causes of Autocorrelation -- Model Misspecification -- The Issue of Stickiness -- 8.3.The Consequences of Autocorrelation -- 8.4.Detecting Autocorrelation -- The Durbin--Watson Statistic -- The Durbin--Watson h-Statistic -- A Simple Hypothesis Test -- A Nonparametric Runs Test -- The Lagrangian Multiplier Test -- The Breusch-Godfrey Test -- 8.5.Correcting for Autocorrelation -- Generalized Least Squares---The Cochrane--Orcutt Method -- Modification of the Cochrane--Orcutt Method -- Incorporating a Lagged Value of the Dependent Variable -- First-Differencing -- Summary -- Chapter Problems -- Conceptual Problems -- Computational Problems -- Appendix: Transforming Data to Eliminate Autocorrelation -- ch. 9 Non-Linear Regression and the Selection of the Proper Functional Form -- Introduction --
Contents note continued: 9.1.The Nature of Curvilinear Models -- 9.2.Polynomials -- 9.3.Quadratics and Cubics -- The Average Cost Curve -- The Revenue Function -- Solving Quadratic Equations and the Vertex -- Cubic Functions -- 9.4.The Use of Logarithmic Transformations: The Double-Log Model -- The Demand Curve as an Example -- The Cobb-Douglas Production Function -- 9.5.Other Logarithmic Transformations -- The Log-Linear Model -- The Linear-Log Model -- The Reciprocal Model -- An Exponential Model -- Continuous Growth Models -- Mixed Models -- Chapter Problems -- Conceptual Problems -- Computational Problems -- Appendix: With Respect to Exponential Functions -- ch. 10 Simultaneous Equations: Two-Stage Least Squares -- Introduction -- 10.1.The Two-Equation Model -- 10.2.Simultaneity Bias -- 10.3.The Reduced-Form Equations -- 10.4.The Identification Problem -- Finding the Proxy -- 10.5.An Illustration of 2SLS -- 10.6.Applying 2SLS to Our Market for Bread --
Contents note continued: 10.7.A Comparison of 2SLS and OLS -- 10.8.The Durbin--Wu--Hausman Test for Simultaneity -- 10.9.A Macroeconomic Model -- Chapter Problems -- Conceptual Problems -- Computational Problems -- ch. 11 Forecasting with Time Series Data and Distributed Lag Models -- Introduction -- 11.1.A Simple Time Series Model -- 11.2.Autoregressive Models -- 11.3.Distributed Lag Models -- The Koyck Transformation (Geometric Lag) -- The Problem of Autocorrelation -- Stationarity and the Dickey--Fuller Test -- Cointegration -- The Almon (Polynomial) Lag -- 11.4.Granger Causality -- 11.5.Methods of Forecasting: Moving Averages and Exponential Smoothing -- Moving Average -- First-Differencing -- Single Exponential Smoothing -- Double Exponential Smoothing -- 11.6.Autoregressive Moving Averages -- The ARMA Model -- Integration---ARIMA -- Box--Jenkins Methodology -- Chapter Problems -- Conceptual Problems -- Computational Problems -- Appendix: The Koyck Transformation --
Contents note continued: APPENDICES -- Appendix A Answers to Selected Even Problems -- Appendix B Statistical Tables -- B.1.Chi-Square Table -- B.2.Durbin--Watson Values -- B.3.F-Distribution Table -- B.4.t-Values for a Two-Tailed Test---Example: t.05,19 = ± 2.0930 -- B.5.The Normal Distribution Table.
Abstract:
This text is designed to help students fully understand regression analysis, its components, and its uses. Reflecting current statistical technology, it focuses on the use and interpretation of software while also demonstrating the logic, reasoning, and calculations that lie behind any statistical analysis. The text further emphasizes the application of regression tools to real-life business concerns. This multilayered yet pragmatic approach equips students to draw the full benefit and meaning from a regression analysis.