Exploring the Obscure


English translation of "Auf den Spuren des Verborgenen" by Karen Horn, an article about the 2000 Nobel Prize in Economics, published in the Frankfurter Allgemeine Zeitung on October 13, 2000.

At first glance, the works of this year's Nobel laureates in economics may scare many people off: James Heckman published "Simultaneous Equation Models with both Continuous and Discrete Endogenous Variables" and the "R2 Goodness of Fit Statistic For Models with Parameters Estimated From Microdata," while Daniel McFadden wrote about "Specification Tests for the Multinomial Logit Model" and "Efficient Estimation by Multinomial Approximation and Sequential Simulation." But this complicated economic mumbo-jumbo should be no reason to turn away in frustration.

The studies by Heckman and McFadden are not only, as you may have guessed, on a high mathematical level. They are also extremely useful: they shaped empirical economic research into what it is today, an instrument that must be taken seriously. Abstraction is not an end in itself here; rather, it enables us to do exactly what is so often called for: to integrate theory, empirics, and practice, abstraction and application. At the same time, this work provides a microfoundation for macroeconomic policy.

Heckman received the award for his development of theory and methods for analyzing so-called selective samples; econometricians also speak of "self-selection". In empirical practice, this happens all the time - namely, whenever the samples under analysis are not representative. This is the case, for instance, when you examine the relationship between wages and working hours, or the impact of a college education on future income - all of them highly relevant questions.

In the case of wages and working hours, it lies in the nature of things that statistical data are available only for persons who are currently employed. For all others, it is impossible to say how long they would be willing to work at a given wage - simply because they do not supply labor on the market. In the case of college education, data on future income can be collected only for college graduates, so we cannot make a broad statement about the potential impact of education on the general wage level. We face the same dilemma when we try to analyze the employment prospects of the long-term unemployed and the effects of active labor market policies: Since difficult-to-place individuals are likely to be overrepresented in longitudinal data, it is safe to assume that estimates ignoring this selection would be unreliable.

Heckman's solution to this problem is so logical and clear that it may almost seem trivial - though it makes substantial demands on the technical side. He devised a method for splitting the determination of labor supply into the separate stages of the decision-making process. The first step is to estimate the probability with which each individual supplies labor on the market at all. In the second step, a correction term derived from these individual probabilities is introduced as an additional explanatory variable into the working-hours/wage model. Legal experts would refer to this process as subsumption.
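To make the two steps concrete, here is a minimal sketch in Python on simulated data - all variable names and parameter values are illustrative assumptions, not Heckman's own specification. The first stage is a probit for labor-market participation; the correction term it yields, known as the inverse Mills ratio, then enters the second-stage wage regression.

```python
# Hypothetical illustration of the two-step selection correction,
# on simulated data (all names and numbers are assumptions).
import numpy as np
import statsmodels.api as sm
from scipy.stats import norm

rng = np.random.default_rng(0)
n = 5000
z = rng.normal(size=n)                      # trait driving participation only
x = rng.normal(size=n)                      # wage determinant, e.g. education
u = rng.multivariate_normal([0.0, 0.0], [[1.0, 0.5], [0.5, 1.0]], size=n)

works = 0.5 * z + 0.8 * x + u[:, 0] > 0    # who supplies labor at all
wage = 1.0 + 2.0 * x + u[:, 1]             # observed only for those who work

# Step 1: probit for each individual's probability of supplying labor.
W = sm.add_constant(np.column_stack([z, x]))
probit = sm.Probit(works.astype(float), W).fit(disp=0)
index = W @ probit.params                  # linear index of the probit
mills = norm.pdf(index) / norm.cdf(index)  # inverse Mills ratio

# Step 2: wage regression on workers only, with the Mills ratio added
# as an extra explanatory variable to absorb the selection effect.
X2 = sm.add_constant(np.column_stack([x[works], mills[works]]))
print(sm.OLS(wage[works], X2).fit().params)
```

In this simulated setup, running the second regression without the correction term would push the estimated coefficient on x away from its true value of 2; with the Mills ratio included, it is approximately recovered.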

McFadden's award-winning contribution was his development of theory and methods for analyzing so-called discrete choice. This is nothing less than a step toward reality. In traditional microeconomic theory, households are assumed to base their consumption decisions on goods prices alone; depending on the available income, the quantity of consumed goods then ranges anywhere from zero to infinity. There is good reason for this assumption: Heuristically narrowing the problem down to mostly linear relationships between means and ends helps to arrive at clear rules. Alas, in real life nothing is infinite, nor does anything come in infinitely small quantities. Many everyday decisions - which occupation to pursue, which mode of transport to take, where to live - are choices among a limited number of discrete alternatives rather than continuous quantities.

Moreover, it is impossible to ascertain all influential factors. What should we do when we want to examine how three variables depend on three other (observable) variables, but must suspect that there are further important factors we cannot measure? For these (frequent) cases, McFadden invented "conditional logit analysis," an econometric model that combines all non-measurable influences into one error term, for which a specific probability distribution is assumed. The two-stage process inherent in this analysis places its methodology in close proximity to Heckman's work.
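The flavor of the model can be conveyed in a few lines. The specific distribution McFadden assumed for the error term - independent draws from a type-I extreme-value (Gumbel) distribution - implies that the probability of choosing an alternative takes the familiar logit form. The sketch below, with made-up attributes for three travel modes, is purely illustrative:

```python
# Hypothetical illustration of conditional logit choice probabilities.
import numpy as np

def choice_probabilities(attributes, beta):
    """With iid Gumbel errors, the probability of alternative j is
    exp(v_j) / sum_k exp(v_k), where v is the observed part of utility."""
    v = attributes @ beta                  # observed utility per alternative
    ev = np.exp(v - v.max())               # subtract the max for stability
    return ev / ev.sum()

# Rows: car, bus, train; columns: travel time (hours), fare (assumed data).
modes = np.array([[0.5, 8.0],
                  [1.2, 3.0],
                  [0.8, 5.0]])
beta = np.array([-1.5, -0.4])              # taste weights (assumed, not estimated)
print(choice_probabilities(modes, beta))   # three probabilities summing to one
```

In an actual application, beta would be estimated by maximum likelihood from observed choices; McFadden's famous study of this kind concerned travel demand for the San Francisco BART system.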

The technical refinement of such models is not easy to grasp. Detractors of the mathematical approach to economics may argue that all this perfectly illustrates how mathematics in economics, with its increasingly technical abstractions, causes economists to slip ever further out of touch with reality and to stumble over their own feet. As a diagnosis, this is not so far from the truth. But trading one devil for another and forgoing the benefits of mathematics altogether is a suggestion that comes close to an admission of one's own intellectual limits. Mathematical abstraction in economics would amount to a complete sell-out only if one were content to leave it at that. This accusation cannot be leveled at Heckman and McFadden, nor at the many researchers who build on their achievements.

Both Nobel laureates apply in practice what they have developed in theory - always with regard to socially and politically relevant issues. Heckman has a special interest in the labor market. In Germany, however, his methods are of hardly any use because the Bundesanstalt für Arbeit (Federal Employment Service) does not provide the necessary data. McFadden has done a lot of research on the (hidden) demand for non-excludable public goods, especially the environment, local public transport, and communication systems. Here we need tricks: Precisely because these are public goods, no one has reason to reveal their hypothetical willingness to pay for them. This presents the classic dilemma of the social sciences: Much of the needed information is not out in the open. But that's exactly what makes it so exciting.

