Chapter 14 Simple Linear Regression Analysis


Copyright © 2014 by The McGraw-Hill Companies, Inc. All rights reserved. McGraw-Hill/Irwin

Chapter outline
14.1 The Simple Linear Regression Model and the Least Squares Point Estimates
14.2 Model Assumptions and the Standard Error
14.3 Testing the Significance of the Slope and y-Intercept
14.4 Confidence and Prediction Intervals
14.5 Simple Coefficients of Determination and Correlation
14.6 Testing the Significance of the Population Correlation Coefficient
14.7 An F Test for the Model
14.8 The QHIC Case
14.9 Residual Analysis
14.10 Some Shortcut Formulas (Optional)

14.1 The Simple Linear Regression Model and the Least Squares Point Estimates
LO14-1: Explain the simple linear regression model.
- The dependent (or response) variable is the variable we wish to understand or predict.
- The independent (or predictor) variable is the variable we use to understand or predict the dependent variable.
- Regression analysis is a statistical technique that uses observed data to relate the dependent variable to one or more independent variables.
- The objective is to build a regression model that can describe, predict, and control the dependent variable based on the independent variable.

Form of the simple linear regression model
- The model is y = β0 + β1x + ε.
- μy|x = β0 + β1x is the mean value of the dependent variable y when the value of the independent variable is x.
- β0 is the y-intercept: the mean of y when x is 0.
- β1 is the slope: the change in the mean of y per unit change in x.
- ε is an error term that describes the effect on y of all factors other than x.

Regression terms
- β0 and β1 are called regression parameters; β0 is the y-intercept and β1 is the slope.
- We do not know the true values of these parameters, so we must use sample data to estimate them.
- b0 is the estimate of β0 and b1 is the estimate of β1.

The least squares point estimates
LO14-2: Find the least squares point estimates of the slope and y-intercept.
- Estimation/prediction equation: ŷ = b0 + b1x.
- Least squares point estimate of the slope β1: b1 = SSxy / SSxx, where SSxy = Σ(xi − x̄)(yi − ȳ) and SSxx = Σ(xi − x̄)².
- Least squares point estimate of the y-intercept β0: b0 = ȳ − b1x̄.
- (These formulas are illustrated in the computational sketch after Section 14.4.)

14.2 Model Assumptions and the Standard Error
LO14-3: Describe the assumptions behind simple linear regression and calculate the standard error.
- Mean of zero: at any given value of x, the population of potential error term values has a mean equal to zero.
- Constant variance: at any given value of x, the population of potential error term values has a variance that does not depend on the value of x.
- Normality: at any given value of x, the population of potential error term values has a normal distribution.
- Independence: any one value of the error term ε is statistically independent of any other value of ε.
- The standard error s = √(SSE / (n − 2)), where SSE = Σ(yi − ŷi)², is the point estimate of σ, the common standard deviation of the error term populations.
(Figure 14.7 illustrates the model assumptions.)

14.3 Testing the Significance of the Slope and y-Intercept
LO14-4: Test the significance of the slope and y-intercept.
- A regression model is not likely to be useful unless there is a significant relationship between x and y.
- To test significance, we test the null hypothesis H0: β1 = 0 versus the alternative hypothesis Ha: β1 ≠ 0.

14.4 Confidence and Prediction Intervals
LO14-5: Calculate and interpret a confidence interval for a mean value and a prediction interval for an individual value.
- The point on the regression line corresponding to a particular value x0 of the independent variable x is ŷ = b0 + b1x0.
- It is unlikely that this value will equal the mean value of y when x equals x0, so we need to place bounds on how far the predicted value might be from the actual value.
- We can do this by calculating a confidence interval for the mean value of y and a prediction interval for an individual value of y.
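The calculations described in Sections 14.1 through 14.4 can be reproduced directly from the formulas above. The Python sketch below is a minimal illustration, not the textbook's own software output: the x/y data set and the value x0 = 5.5 are hypothetical, chosen only so the code runs end to end.

```python
# Minimal sketch of Sections 14.1-14.4 using NumPy and SciPy.
# The x/y data and x0 below are hypothetical, used only for illustration.
import numpy as np
from scipy import stats

x = np.array([1.0, 2.0, 3.0, 4.0, 5.0, 6.0, 7.0, 8.0])     # independent (predictor) variable
y = np.array([2.1, 3.9, 6.2, 7.8, 9.9, 12.1, 14.2, 15.8])  # dependent (response) variable
n = len(x)

# 14.1  Least squares point estimates b0 and b1
x_bar, y_bar = x.mean(), y.mean()
SS_xy = np.sum((x - x_bar) * (y - y_bar))
SS_xx = np.sum((x - x_bar) ** 2)
b1 = SS_xy / SS_xx               # point estimate of the slope beta1
b0 = y_bar - b1 * x_bar          # point estimate of the y-intercept beta0

# 14.2  Standard error s (point estimate of sigma)
y_hat = b0 + b1 * x
SSE = np.sum((y - y_hat) ** 2)   # sum of squared residuals
s = np.sqrt(SSE / (n - 2))

# 14.3  t test of H0: beta1 = 0 versus Ha: beta1 != 0
s_b1 = s / np.sqrt(SS_xx)        # standard error of b1
t_stat = b1 / s_b1
p_value = 2 * stats.t.sf(abs(t_stat), df=n - 2)

# 14.4  95% confidence interval for the mean of y and 95% prediction interval
#       for an individual y at a chosen value x0 (x0 is arbitrary here)
x0 = 5.5
y_hat0 = b0 + b1 * x0
t_crit = stats.t.ppf(0.975, df=n - 2)
dist = np.sqrt(1.0 / n + (x0 - x_bar) ** 2 / SS_xx)          # "distance value" for the CI
ci = (y_hat0 - t_crit * s * dist, y_hat0 + t_crit * s * dist)
pi_half = t_crit * s * np.sqrt(1.0 + 1.0 / n + (x0 - x_bar) ** 2 / SS_xx)
pi = (y_hat0 - pi_half, y_hat0 + pi_half)

print(f"b0 = {b0:.3f}, b1 = {b1:.3f}, s = {s:.3f}")
print(f"t = {t_stat:.2f}, two-sided p-value = {p_value:.4g}")
print(f"95% CI for mean y at x0 = {x0}: ({ci[0]:.2f}, {ci[1]:.2f})")
print(f"95% PI for an individual y at x0 = {x0}: ({pi[0]:.2f}, {pi[1]:.2f})")
```

If the printed p-value is below the chosen significance level, we reject H0: β1 = 0 and conclude the slope is significant. The prediction interval is always wider than the corresponding confidence interval because it also accounts for the variation of an individual y value around its mean.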
14.5 Simple Coefficients of Determination and Correlation
LO14-6: Calculate and interpret the simple coefficients of determination and correlation. (This section may be covered any time after reading Section 14.1.)
- How useful is a particular regression model? One measure of usefulness is the simple coefficient of determination, represented by the symbol r².

14.6 Testing the Significance of the Population Correlation Coefficient
LO14-7: Test hypotheses about the population correlation coefficient.
- The simple correlation coefficient (r) measures the linear relationship between the observed values of x and y from the sample.
- The population correlation coefficient (ρ) measures the linear relationship between all possible combinations of observed values of x and y.
- r is an estimate of ρ.

14.7 An F Test for the Model
LO14-8: Test the significance of a simple linear regression model by using an F test.
- For simple regression, this is another way to test the null hypothesis H0: β1 = 0.
- The F test is the only test we will use for multiple regression.
- The F test tests the significance of the overall regression relationship between x and y.

14.9 Residual Analysis
LO14-9: Use residual analysis to check the assumptions of simple linear regression.
- Checks of the regression assumptions are performed by analyzing the regression residuals.
- Residuals (e) are defined as the difference between the observed value of y and the predicted value of y: e = y − ŷ. Note that e is the point estimate of ε.
- If the regression assumptions are valid, the population of potential error terms will be normally distributed with mean zero and variance σ², and different error terms will be statistically independent.
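For Sections 14.5 through 14.7, a similar sketch (again with hypothetical data, refitting the least squares line so the block is self-contained) computes r², the simple correlation coefficient r, the t statistic for H0: ρ = 0, and the F statistic for the overall model.

```python
# Sketch of Sections 14.5-14.7: r^2, r, a t test for rho, and the model F test.
# The data are hypothetical; the least squares line is refit so the block stands alone.
import numpy as np
from scipy import stats

x = np.array([1.0, 2.0, 3.0, 4.0, 5.0, 6.0, 7.0, 8.0])
y = np.array([2.1, 3.9, 6.2, 7.8, 9.9, 12.1, 14.2, 15.8])
n = len(x)

# Least squares fit (Section 14.1)
b1 = np.sum((x - x.mean()) * (y - y.mean())) / np.sum((x - x.mean()) ** 2)
b0 = y.mean() - b1 * x.mean()
y_hat = b0 + b1 * x

# 14.5  Simple coefficients of determination and correlation
SSE = np.sum((y - y_hat) ** 2)                 # unexplained variation
SS_yy = np.sum((y - y.mean()) ** 2)            # total variation
r_squared = (SS_yy - SSE) / SS_yy              # proportion of variation explained
r = np.sign(b1) * np.sqrt(r_squared)           # simple correlation coefficient

# 14.6  t test of H0: rho = 0 versus Ha: rho != 0
t_r = r * np.sqrt(n - 2) / np.sqrt(1 - r ** 2)
p_r = 2 * stats.t.sf(abs(t_r), df=n - 2)

# 14.7  F test for the model: F = explained variation / (SSE / (n - 2)), df = 1 and n - 2
F = (SS_yy - SSE) / (SSE / (n - 2))
p_F = stats.f.sf(F, dfn=1, dfd=n - 2)

print(f"r^2 = {r_squared:.4f}, r = {r:.4f}")
print(f"t for rho: {t_r:.2f} (p = {p_r:.4g})")
print(f"F = {F:.2f} (p = {p_F:.4g})")
```

In simple linear regression the F test and the slope t test are equivalent (F = t²), which is why the F test only becomes essential in multiple regression.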
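For Section 14.9, residual plots are the usual way to check the assumptions. The sketch below (hypothetical data again; matplotlib and SciPy's probplot are one reasonable choice of tools, not the textbook's prescribed software) plots the residuals against x and draws a normal probability plot.

```python
# Sketch of Section 14.9: residual plots for checking the regression assumptions.
# Hypothetical data; any plotting library could be used in place of matplotlib.
import numpy as np
import matplotlib.pyplot as plt
from scipy import stats

x = np.array([1.0, 2.0, 3.0, 4.0, 5.0, 6.0, 7.0, 8.0])
y = np.array([2.1, 3.9, 6.2, 7.8, 9.9, 12.1, 14.2, 15.8])

# Least squares fit and residuals e = y - y_hat (point estimates of the error terms)
b1 = np.sum((x - x.mean()) * (y - y.mean())) / np.sum((x - x.mean()) ** 2)
b0 = y.mean() - b1 * x.mean()
residuals = y - (b0 + b1 * x)

fig, axes = plt.subplots(1, 2, figsize=(10, 4))

# Residuals versus x: look for curvature (wrong model form) or a funnel shape (non-constant variance)
axes[0].scatter(x, residuals)
axes[0].axhline(0.0, linestyle="--")
axes[0].set_xlabel("x")
axes[0].set_ylabel("residual")
axes[0].set_title("Residuals versus x")

# Normal probability plot of the residuals: points near a straight line support normality
stats.probplot(residuals, dist="norm", plot=axes[1])
axes[1].set_title("Normal probability plot of residuals")

plt.tight_layout()
plt.show()
```

A roughly horizontal, patternless band of residuals supports the constant variance assumption and the assumed functional form, while an approximately straight normal probability plot supports the normality assumption.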