#1. What is Econometrics?

Econometrics may be defined broadly as the application of mathematical and statistical methods to the analysis of economic data. Theoretical econometrics studies the statistical properties of estimators in an economic model. Those properties determine whether an economic hypothesis can be accepted or rejected on the basis of statistical inference from the available data set.

#2. Nobel laureates (examples of models).

1) 1969. Jan Tinbergen, former Professor at the Erasmus University Rotterdam, and Ragnar Frisch were awarded the first prize in 1969 for having developed and applied dynamic models for the analysis of economic processes.

2) 1980. Lawrence Klein, Professor of Economics at the University of Pennsylvania, was awarded in 1980 for his computer modeling work in the field (econometric models and their application to the analysis of economic fluctuations and economic policies).

3) 1981. James Tobin, for his analysis of financial markets and their relations to expenditure decisions, employment, production and prices.

4) 1989. Trygve Haavelmo was awarded the Nobel Memorial Prize in Economic Sciences in 1989. His main contribution to econometrics was his 1944 article (published in Econometrica) "The Probability Approach to Econometrics." (probability theory foundation of econometrics and his analysis of economic structures)

5) 1995. Robert E. Lucas, for having developed and applied the hypothesis of rational expectations, thereby transforming macroeconomic analysis and deepening our understanding of economic policy.

6) 2000. Daniel McFadden and James Heckman were awarded in 2000 for their work in microeconometrics. McFadden founded the econometrics lab at the University of California, Berkeley. (James Heckman – theory and methods for analyzing selective samples; Daniel McFadden – theory and methods for analyzing discrete choice)

7) 2003. Robert Engle, at New York University, and Clive Granger, at the University of California, San Diego, were awarded in 2003 for work on analyzing economic time series. Engle pioneered the method of autoregressive conditional heteroskedasticity (ARCH) and Granger the method of cointegration.

#3. Role of Econometrics.

Two main purposes of econometrics are to give empirical content to economic theory by formulating economic models in testable form and to estimate those models and test them as to acceptance or rejection.

For example, consider one of the basic relationships in economics: the relationship between the price of a commodity and the quantities of that commodity that people wish to purchase at each price (the demand relationship). According to economic theory, an increase in the price would lead to a decrease in the quantity demanded, holding other relevant variables constant so as to isolate the relationship of interest. A mathematical equation can be written that describes the relationship between quantity, price, other demand variables like income, and a random term ε to reflect simplification and imprecision of the theoretical model:
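In its simplest linear form (using the parameters named below) the demand equation can be written as

Q = β0 + β1·P + β2·Y + ε,

where Q is the quantity demanded, P is the price, Y is income, and ε is the random error term.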

Regression analysis could be used to estimate the unknown parameters β0, β1, and β2 in the relationship, using data on price, income, and quantity. The model could then be tested for statistical significance as to whether an increase in price is associated with a decrease in the quantity, as hypothesized: β1 < 0.
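As an illustration, here is a minimal sketch (Python with numpy and statsmodels, on simulated data, since no actual data set is given here) of how such a demand equation could be estimated by ordinary least squares and the sign of β1 checked:

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm

# Simulated data for illustration only: the "true" beta1 is -2.0 and beta2 is 0.5.
rng = np.random.default_rng(0)
n = 200
price = rng.uniform(1, 10, n)
income = rng.uniform(20, 100, n)
quantity = 50 - 2.0 * price + 0.5 * income + rng.normal(0, 5, n)

# Estimate Q = b0 + b1*price + b2*income + e by OLS.
X = sm.add_constant(pd.DataFrame({"price": price, "income": income}))
fit = sm.OLS(quantity, X).fit()
print(fit.summary())

# The hypothesis beta1 < 0 is supported if the estimated price coefficient
# is negative and statistically significant.
print("beta1:", fit.params["price"], "p-value:", fit.pvalues["price"])
```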

There are complications even in this simple example, and it is easy to mistake statistical significance for economic significance. Statistical significance is neither necessary nor sufficient for economic significance.[7] In order to estimate the theoretical demand relationship, the observations in the data set must be price and quantity pairs collected along a stable demand schedule. If those assumptions are not satisfied, a more sophisticated model or econometric method may be necessary to derive reliable estimates and tests.

#4. Main Application of Econometrics.

The field of econometrics has developed methods for identification and estimation of simultaneous equation models. These methods may allow researchers to make causal inferences in the absence of controlled experiments. Computational elements of econometric methods may in turn determine their usefulness. An econometric model is one of a range of tools used to replicate and simulate the main mechanisms of a regional, national or international economic system.

The primary application of econometric models in the area of evaluation is to simulate counterfactual situations, and thereby to quantitatively evaluate the net effects of public actions.

An econometric model is one of the tools economists use to forecast future developments in the economy.

#5. Cross Section Data.

Cross-section data is data collected at a given point in time, e.g. a sample of households or firms, from each of which a number of variables, such as turnover, operating margin, and market value of shares, are measured.

From an econometric point of view it is important that the observations constitute a random sample from the underlying population.

For example, we want to measure current obesity levels in a population. We could draw a sample of 1,000 people randomly from that population (also known as a cross section of that population), measure their weight and height, and calculate what percentage of that sample is categorized as obese. Suppose, for instance, that 30% of our sample is categorized as obese. This cross-sectional sample provides us with a snapshot of that population at one point in time. Note that we do not know, based on one cross-sectional sample, whether obesity is increasing or decreasing; we can only describe the current proportion.
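A minimal sketch of this calculation (Python, with made-up weights and heights and the usual BMI ≥ 30 cut-off for obesity) might look like:

```python
import numpy as np

# Hypothetical cross section: weight (kg) and height (m) of 1,000 people
# sampled at a single point in time.
rng = np.random.default_rng(1)
weight = rng.normal(78, 14, 1000)
height = rng.normal(1.70, 0.09, 1000)

bmi = weight / height ** 2
obese_share = np.mean(bmi >= 30)      # BMI >= 30 is the usual obesity cut-off
print(f"Share categorized as obese: {obese_share:.1%}")
```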

Cross-sectional data differs from time series data also known as longitudinal data, which follows one subject's changes over the course of time. Another variant, panel data (or time-series cross-sectional (TSCS) data), combines both and looks at multiple subjects and how they change over the course of time. Panel analysis uses panel data to examine changes in variables over time and differences in variables between subjects.

In a rolling cross-section, both the presence of an individual in the sample and the time at which the individual is included in the sample are determined randomly. For example, a political poll may decide to interview 100,000 individuals. It first selects these individuals randomly from the entire population. It then assigns a random date to each individual. This is the random date on which that individual will be interviewed, and thus included in the survey.
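A rough sketch of such a design (Python; the population size, sample size and date window are all hypothetical) could be:

```python
import numpy as np
import pandas as pd

# Rolling cross-section design: first draw the respondents at random,
# then assign each respondent a random interview date.
rng = np.random.default_rng(2)
population_ids = np.arange(1_000_000)                  # e.g. a population register
respondents = rng.choice(population_ids, size=100_000, replace=False)
offsets = rng.integers(0, 90, size=respondents.size)   # random day within a 90-day window
dates = pd.Timestamp("2024-01-01") + pd.to_timedelta(offsets, unit="D")

survey = pd.DataFrame({"person_id": respondents, "interview_date": dates})
print(survey.head())
```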

#6. Time Series Data.

We observe one variable or several variables over T periods.

A time series consists of observations on a variable (or several variables) over time. Typical examples are daily share prices, interest rates, and CPI values.

An important additional feature over cross-sectional data is the ordering of the observations, which may convey important information.

An additional feature is data frequency which may require special attention.
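A small sketch (Python/pandas, with made-up CPI values) of how the ordering and the frequency are carried along with the data:

```python
import pandas as pd

# A short monthly time series with hypothetical CPI values. The index keeps
# both the ordering of the observations and the data frequency.
cpi = pd.Series(
    [104.1, 104.5, 104.9, 105.6],
    index=pd.period_range("2024-01", periods=4, freq="M"),
    name="CPI",
)
print(cpi)
print("Frequency:", cpi.index.freqstr)   # 'M' means monthly observations
```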

#7. Pooled Cross Section Data.

We observe two cross sections from two different periods (e.g. the wages of individual workers at two points in time).

It has both time series and cross-section features. An example is a data set where a number of firms are randomly selected, say in 1990, and another sample is selected in 2000. If the same features are measured in both samples, combining the two years forms a pooled cross-section data set.

Pooled cross-section data is analyzed much the same way as usual cross-section data.

However, it is often important to pay special attention to the fact that there are 10 years between the two samples. Usually the interest is in whether there are important changes between the time points. Statistical tools are usually the same as those used for analyzing differences between two independently sampled populations.
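A minimal sketch (Python with simulated firm data; the variable names are hypothetical) of comparing the two sampling years with a year dummy in a pooled regression, which is equivalent to the usual two-sample comparison of means:

```python
import numpy as np
import statsmodels.api as sm

# Simulated pooled cross sections: one sample of firms from 1990, another from 2000.
rng = np.random.default_rng(3)
n = 150
turnover_1990 = rng.normal(10, 2, n)
turnover_2000 = rng.normal(12, 2, n)

y = np.concatenate([turnover_1990, turnover_2000])
year2000 = np.concatenate([np.zeros(n), np.ones(n)])   # dummy: 0 = 1990, 1 = 2000

# The coefficient on the year dummy estimates the change in mean turnover between
# the two sampling dates; its t-test is the usual two-sample comparison.
fit = sm.OLS(y, sm.add_constant(year2000)).fit()
print(fit.params)
print(fit.pvalues)
```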
