2025/7/9
1. Problem Description
The problem describes consequences of violating Ordinary Least Squares (OLS) assumptions such as heteroskedasticity, autocorrelation, multicollinearity, and non-normality on standard errors and coefficients of the regression model. We need to fill in the blanks to complete the statements.
2. Solution Steps
* Heteroskedasticity: Heteroskedasticity makes the conventional OLS standard errors biased, so t-tests and F-tests based on them are unreliable. The coefficient estimates remain unbiased (though no longer efficient).
* Autocorrelation: Autocorrelation likewise biases the conventional standard errors. The coefficient estimates remain unbiased.
* Multicollinearity: Multicollinearity inflates the standard errors of the affected coefficients, making it harder to obtain statistically significant results. The coefficient estimates themselves remain unbiased.
* Non-normality: Non-normality of the errors affects exact hypothesis testing with t-tests or F-tests, mainly in small samples; in large samples inference is justified by the central limit theorem. The coefficient estimates remain unbiased and consistent, and the standard errors remain consistent, so the standard errors are not affected and the coefficients are still unbiased.
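The heteroskedasticity claim above can be checked with a small Monte Carlo simulation. This is an illustrative NumPy sketch with made-up data (the model, sample size, and error structure are assumptions, not part of the original problem): repeatedly fitting OLS with errors whose variance grows with x shows the slope estimate averaging to its true value while the conventional standard-error formula misstates the true sampling spread.

```python
import numpy as np

# Monte Carlo sketch (synthetic data, for illustration only):
# with heteroskedastic errors, OLS slope estimates stay unbiased,
# but the conventional (homoskedastic) SE formula is biased.
rng = np.random.default_rng(0)
n, reps = 200, 2000
true_beta = 2.0
slopes, reported_ses = [], []
for _ in range(reps):
    x = rng.uniform(1, 5, n)
    e = rng.normal(0, x**2)       # error std dev grows with x (heteroskedastic)
    y = 1.0 + true_beta * x + e
    X = np.column_stack([np.ones(n), x])
    beta_hat, *_ = np.linalg.lstsq(X, y, rcond=None)
    resid = y - X @ beta_hat
    s2 = resid @ resid / (n - 2)              # assumes constant error variance
    cov = s2 * np.linalg.inv(X.T @ X)         # conventional OLS covariance
    slopes.append(beta_hat[1])
    reported_ses.append(np.sqrt(cov[1, 1]))

print("mean slope estimate:", np.mean(slopes))        # close to 2 -> unbiased
print("true sampling SD:   ", np.std(slopes))
print("avg reported OLS SE:", np.mean(reported_ses))  # understates the spread
# in this design the conventional SE understates the truth because the
# high-variance observations are also the high-leverage ones.
```

In practice one would report heteroskedasticity-robust (e.g. White/HC) standard errors instead of the conventional formula; the point of the sketch is only that the coefficient itself needs no correction.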
3. Final Answer
* Heteroskedasticity causes biased standard errors and unbiased coefficients.
* Autocorrelation causes biased standard errors and unbiased coefficients.
* Multicollinearity causes increased standard errors and unbiased coefficients.
* Non-normality leaves the standard errors unaffected and the coefficients unbiased.
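The multicollinearity conclusion can also be demonstrated numerically. Below is a minimal NumPy sketch (the data-generating process and the `slope_se` helper are assumptions for illustration): fitting the same model once with an independent second regressor and once with a nearly collinear one shows the standard error on the first coefficient inflating sharply, even though nothing about unbiasedness changes.

```python
import numpy as np

# Sketch of multicollinearity inflating standard errors (synthetic data):
# identical error variance, but a nearly collinear second regressor
# makes the SE on the x1 coefficient many times larger.
rng = np.random.default_rng(1)
n = 500
x1 = rng.normal(size=n)

def slope_se(x2):
    """Fit y = 1 + x1 + x2 + noise by OLS; return the SE of the x1 slope."""
    X = np.column_stack([np.ones(n), x1, x2])
    y = 1.0 + x1 + x2 + rng.normal(size=n)
    beta_hat, *_ = np.linalg.lstsq(X, y, rcond=None)
    resid = y - X @ beta_hat
    s2 = resid @ resid / (n - 3)
    cov = s2 * np.linalg.inv(X.T @ X)
    return np.sqrt(cov[1, 1])

se_indep = slope_se(rng.normal(size=n))               # uncorrelated regressor
se_collin = slope_se(x1 + 0.05 * rng.normal(size=n))  # corr(x1, x2) ~ 0.999
print("SE with independent x2:", se_indep)
print("SE with collinear  x2:", se_collin)  # far larger
```

The inflation factor here corresponds to the variance inflation factor VIF = 1/(1 - R²) from regressing one collinear regressor on the other, which is why near-perfect correlation makes individual coefficients hard to distinguish from zero.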