
Table of Contents


Preface
Acknowledgments
Notation
1. Introduction
1.1 What Is Econometrics?
1.2 The Probability Approach to Econometrics
1.3 Econometric Terms
1.4 Observational Data
1.5 Standard Data Structures
1.6 Econometric Software
1.7 Replication
1.8 Data Files for Textbook
1.9 Reading the Book
Part I. Regression
2. Conditional Expectation and Projection
2.1 Introduction
2.2 The Distribution of Wages
2.3 Conditional Expectation
2.4 Logs and Percentages
2.5 Conditional Expectation Function
2.6 Continuous Variables
2.7 Law of Iterated Expectations
2.8 CEF Error
2.9 Intercept-Only Model
2.10 Regression Variance
2.11 Best Predictor
2.12 Conditional Variance
2.13 Homoskedasticity and Heteroskedasticity
2.14 Regression Derivative
2.15 Linear CEF
2.16 Linear CEF with Nonlinear Effects
2.17 Linear CEF with Dummy Variables
2.18 Best Linear Predictor
2.19 Illustrations of Best Linear Predictor
2.20 Linear Predictor Error Variance
2.21 Regression Coefficients
2.22 Regression Subvectors
2.23 Coefficient Decomposition
2.24 Omitted Variable Bias
2.25 Best Linear Approximation
2.26 Regression to the Mean
2.27 Reverse Regression
2.28 Limitations of the Best Linear Projection
2.29 Random Coefficient Model
2.30 Causal Effects
2.31 Existence and Uniqueness of the Conditional Expectation
2.32 Identification
2.33 Technical Proofs
2.34 Exercises
3. The Algebra of Least Squares
3.1 Introduction
3.2 Samples
3.3 Moment Estimators
3.4 Least Squares Estimator
3.5 Solving for Least Squares with One Regressor
3.6 Solving for Least Squares with Multiple Regressors
3.7 Illustration
3.8 Least Squares Residuals
3.9 Demeaned Regressors
3.10 Model in Matrix Notation
3.11 Projection Matrix
3.12 Annihilator Matrix
3.13 Estimation of Error Variance
3.14 Analysis of Variance
3.15 Projections
3.16 Regression Components
3.17 Regression Components (Alternative Derivation)
3.18 Residual Regression
3.19 Leverage Values
3.20 Leave-One-Out Regression
3.21 Influential Observations
3.22 CPS Dataset
3.23 Numerical Computation
3.24 Collinearity Errors
3.25 Programming
3.26 Exercises
4. Least Squares Regression
4.1 Introduction
4.2 Random Sampling
4.3 Sample Mean
4.4 Linear Regression Model
4.5 Expectation of Least Squares Estimator
4.6 Variance of Least Squares Estimator
4.7 Unconditional Moments
4.8 Gauss-Markov Theorem
4.9 Generalized Least Squares
4.10 Residuals
4.11 Estimation of Error Variance
4.12 Mean-Squared Forecast Error
4.13 Covariance Matrix Estimation under Homoskedasticity
4.14 Covariance Matrix Estimation under Heteroskedasticity
4.15 Standard Errors
4.16 Estimation with Sparse Dummy Variables
4.17 Computation
4.18 Measures of Fit
4.19 Empirical Example
4.20 Multicollinearity
4.21 Clustered Sampling
4.22 Inference with Clustered Samples
4.23 At What Level to Cluster
4.24 Technical Proofs
4.25 Exercises
5. Normal Regression
5.1 Introduction
5.2 The Normal Distribution
5.3 Multivariate Normal Distribution
5.4 Joint Normality and Linear Regression
5.5 Normal Regression Model
5.6 Distribution of OLS Coefficient Vector
5.7 Distribution of OLS Residual Vector
5.8 Distribution of Variance Estimator
5.9 t-Statistic
5.10 Confidence Intervals for Regression Coefficients
5.11 Confidence Intervals for Error Variance
5.12 t-Test
5.13 Likelihood Ratio Test
5.14 Information Bound for Normal Regression
5.15 Exercises
Part II. Large Sample Methods
6. A Review of Large Sample Asymptotics
6.1 Introduction
6.2 Modes of Convergence
6.3 Weak Law of Large Numbers
6.4 Central Limit Theorem
6.5 Continuous Mapping Theorem and Delta Method
6.6 Smooth Function Model
6.7 Stochastic Order Symbols
6.8 Convergence of Moments
7. Asymptotic Theory for Least Squares
7.1 Introduction
7.2 Consistency of Least Squares Estimator
7.3 Asymptotic Normality
7.4 Joint Distribution
7.5 Consistency of Error Variance Estimators
7.6 Homoskedastic Covariance Matrix Estimation
7.7 Heteroskedastic Covariance Matrix Estimation
7.8 Summary of Covariance Matrix Notation
7.9 Alternative Covariance Matrix Estimators
7.10 Functions of Parameters
7.11 Asymptotic Standard Errors
7.12 t-Statistic
7.13 Confidence Intervals
7.14 Regression Intervals
7.15 Forecast Intervals
7.16 Wald Statistic
7.17 Homoskedastic Wald Statistic
7.18 Confidence Regions
7.19 Edgeworth Expansion
7.20 Uniformly Consistent Residuals
7.21 Asymptotic Leverage
7.22 Exercises
8. Restricted Estimation
8.1 Introduction
8.2 Constrained Least Squares
8.3 Exclusion Restriction
8.4 Finite Sample Properties
8.5 Minimum Distance
8.6 Asymptotic Distribution
8.7 Variance Estimation and Standard Errors
8.8 Efficient Minimum Distance Estimator
8.9 Exclusion Restriction Revisited
8.10 Variance and Standard Error Estimation
8.11 Hausman Equality
8.12 Example: Mankiw, Romer, and Weil (1992)
8.13 Misspecification
8.14 Nonlinear Constraints
8.15 Inequality Restrictions
8.16 Technical Proofs
8.17 Exercises
9. Hypothesis Testing
9.1 Introduction
9.2 Hypotheses
9.3 Acceptance and Rejection
9.4 Type I Error
9.5 t-Tests
9.6 Type II Error and Power
9.7 Statistical Significance
9.8 p-Values
9.9 t-Ratios and the Abuse of Testing
9.10 Wald Tests
9.11 Homoskedastic Wald Tests
9.12 Criterion-Based Tests
9.13 Minimum Distance Tests
9.14 Minimum Distance Tests under Homoskedasticity
9.15 F-Tests
9.16 Hausman Tests
9.17 Score Tests
9.18 Problems with Tests of Nonlinear Hypotheses
9.19 Monte Carlo Simulation
9.20 Confidence Intervals by Test Inversion
9.21 Multiple Tests and Bonferroni Corrections
9.22 Power and Test Consistency
9.23 Asymptotic Local Power
9.24 Asymptotic Local Power, Vector Case
9.25 Exercises
10. Resampling Methods
10.1 Introduction
10.2 Example
10.3 Jackknife Estimation of Variance
10.4 Example
10.5 Jackknife for Clustered Observations
10.6 The Bootstrap Algorithm
10.7 Bootstrap Variance and Standard Errors
10.8 Percentile Interval
10.9 The Bootstrap Distribution
10.10 The Distribution of the Bootstrap Observations
10.11 The Distribution of the Bootstrap Sample Mean
10.12 Bootstrap Asymptotics
10.13 Consistency of the Bootstrap Estimate of Variance
10.14 Trimmed Estimator of Bootstrap Variance
10.15 Unreliability of Untrimmed Bootstrap Standard Errors
10.16 Consistency of the Percentile Interval
10.17 Bias-Corrected Percentile Interval
10.18 BCa Percentile Interval
10.19 Percentile-t Interval
10.20 Percentile-t Asymptotic Refinement
10.21 Bootstrap Hypothesis Tests
10.22 Wald-Type Bootstrap Tests
10.23 Criterion-Based Bootstrap Tests
10.24 Parametric Bootstrap
10.25 How Many Bootstrap Replications?
10.26 Setting the Bootstrap Seed
10.27 Bootstrap Regression
10.28 Bootstrap Regression Asymptotic Theory
10.29 Wild Bootstrap
10.30 Bootstrap for Clustered Observations
10.31 Technical Proofs
10.32 Exercises
Part III. Multiple Equation Models
11. Multivariate Regression
11.1 Introduction
11.2 Regression Systems
11.3 Least Squares Estimator
11.4 Expectation and Variance of Systems Least Squares
11.5 Asymptotic Distribution
11.6 Covariance Matrix Estimation
11.7 Seemingly Unrelated Regression
11.8 Equivalence of SUR and Least Squares
11.9 Maximum Likelihood Estimator
11.10 Restricted Estimation
11.11 Reduced Rank Regression
11.12 Principal Component Analysis
11.13 Factor Models
11.14 Approximate Factor Models
11.15 Factor Models with Additional Regressors
11.16 Factor-Augmented Regression
11.17 Multivariate Normal
11.18 Exercises
12. Instrumental Variables
12.1 Introduction
12.2 Overview
12.3 Examples
12.4 Endogenous Regressors
12.5 Instruments
12.6 Example: College Proximity
12.7 Reduced Form
12.8 Identification
12.9 Instrumental Variables Estimator
12.10 Demeaned Representation
12.11 Wald Estimator
12.12 Two-Stage Least Squares
12.13 Limited Information Maximum Likelihood
12.14 Split-Sample IV and JIVE
12.15 Consistency of 2SLS
12.16 Asymptotic Distribution of 2SLS
12.17 Determinants of 2SLS Variance
12.18 Covariance Matrix Estimation
12.19 LIML Asymptotic Distribution
12.20 Functions of Parameters
12.21 Hypothesis Tests
12.22 Finite Sample Theory
12.23 Bootstrap for 2SLS
12.24 The Peril of Bootstrap 2SLS Standard Errors
12.25 Clustered Dependence
12.26 Generated Regressors
12.27 Regression with Expectation Errors
12.28 Control Function Regression
12.29 Endogeneity Tests
12.30 Subset Endogeneity Tests
12.31 Overidentification Tests
12.32 Subset Overidentification Tests
12.33 Bootstrap Overidentification Tests
12.34 Local Average Treatment Effects
12.35 Identification Failure
12.36 Weak Instruments
12.37 Many Instruments
12.38 Testing for Weak Instruments
12.39 Weak Instruments with k2 > 1
12.40 Example: Acemoglu, Johnson, and Robinson (2001)
12.41 Example: Angrist and Krueger (1991)
12.42 Programming
12.43 Exercises
13. Generalized Method of Moments
13.1 Introduction
13.2 Moment Equation Models
13.3 Method of Moments Estimators
13.4 Overidentified Moment Equations
13.5 Linear Moment Models
13.6 GMM Estimator
13.7 Distribution of GMM Estimator
13.8 Efficient GMM
13.9 Efficient GMM versus 2SLS
13.10 Estimation of the Efficient Weight Matrix
13.11 Iterated GMM
13.12 Covariance Matrix Estimation
13.13 Clustered Dependence
13.14 Wald Test
13.15 Restricted GMM
13.16 Nonlinear Restricted GMM
13.17 Constrained Regression
13.18 Multivariate Regression
13.19 Distance Test
13.20 Continuously Updated GMM
13.21 Overidentification Test
13.22 Subset Overidentification Tests
13.23 Endogeneity Test
13.24 Subset Endogeneity Test
13.25 Nonlinear GMM
13.26 Bootstrap for GMM
13.27 Conditional Moment Equation Models
13.28 Technical Proofs
13.29 Exercises
Part IV. Dependent and Panel Data
14. Time Series
14.1 Introduction
14.2 Examples
14.3 Differences and Growth Rates
14.4 Stationarity
14.5 Transformations of Stationary Processes
14.6 Convergent Series
14.7 Ergodicity
14.8 Ergodic Theorem
14.9 Conditioning on Information Sets
14.10 Martingale Difference Sequences
14.11 CLT for Martingale Differences
14.12 Mixing
14.13 CLT for Correlated Observations
14.14 Linear Projection
14.15 White Noise
14.16 The Wold Decomposition
14.17 Lag Operator
14.18 Autoregressive Wold Representation
14.19 Linear Models
14.20 Moving Average Process
14.21 Infinite-Order Moving Average Process
14.22 First-Order Autoregressive Process
14.23 Unit Root and Explosive AR(1) Processes
14.24 Second-Order Autoregressive Process
14.25 AR(p) Process
14.26 Impulse Response Function
14.27 ARMA and ARIMA Processes
14.28 Mixing Properties of Linear Processes
14.29 Identification
14.30 Estimation of Autoregressive Models
14.31 Asymptotic Distribution of Least Squares Estimator
14.32 Distribution under Homoskedasticity
14.33 Asymptotic Distribution under General Dependence
14.34 Covariance Matrix Estimation
14.35 Covariance Matrix Estimation under General Dependence
14.36 Testing the Hypothesis of No Serial Correlation
14.37 Testing for Omitted Serial Correlation
14.38 Model Selection
14.39 Illustrations
14.40 Time Series Regression Models
14.41 Static, Distributed Lag, and Autoregressive Distributed Lag Models
14.42 Time Trends
14.43 Illustration
14.44 Granger Causality
14.45 Testing for Serial Correlation in Regression Models
14.46 Bootstrap for Time Series
14.47 Technical Proofs
14.48 Exercises
15. Multivariate Time Series
15.1 Introduction
15.2 Multiple Equation Time Series Models
15.3 Linear Projection
15.4 Multivariate Wold Decomposition
15.5 Impulse Response
15.6 VAR(1) Model
15.7 VAR(p) Model
15.8 Regression Notation
15.9 Estimation
15.10 Asymptotic Distribution

Availability

Registration No. | Call No. | Volume Info | Location | Status
0003003301 | 330.015195 -A23-10 | - | Seoul Main Library stacks (request item, then collect at 1st-floor circulation desk) | Available

Publisher's Book Description

Provided by Aladin

The most authoritative and up-to-date core econometrics textbook available

Econometrics is the quantitative language of economic theory, analysis, and empirical work, and it has become a cornerstone of graduate economics programs. Econometrics provides graduate and PhD students with an essential introduction to this foundational subject in economics and serves as an invaluable reference for researchers and practitioners. This comprehensive textbook teaches fundamental concepts, emphasizes modern, real-world applications, and gives students an intuitive understanding of econometrics.

  • Covers the full breadth of econometric theory and methods with mathematical rigor while emphasizing intuitive explanations that are accessible to students of all backgrounds
  • Draws on integrated, research-level datasets, provided on an accompanying website
  • Discusses linear econometrics, time series, panel data, nonparametric methods, nonlinear econometric models, and modern machine learning
  • Features hundreds of exercises that enable students to learn by doing
  • Includes in-depth appendices on matrix algebra and useful inequalities and a wealth of real-world examples
  • Can serve as a core textbook for a first-year PhD course in econometrics and as a follow-up to Bruce E. Hansen’s Probability and Statistics for Economists