3.9.12

"(Econo)metrics: from Political Arithmetic to the Probability Revolution": Abstracts

WORKSHOP (JOURNÉE D'ÉTUDES)

(Écono)métrie: de l’arithmétique politique 
à la révolution probabiliste

(Econo)metrics: from Political Arithmetic
to the Probability Revolution

Rome, 7 September 2012
École française de Rome
Piazza Navona, 62
00186 Roma

-----------------------------------

Abstracts

Quantifying the Social Sciences: An Historical and Comparative Perspective

Alain Desrosières (INSEE, Paris)

The various social sciences have gradually been quantified since the middle of the 19th century. This quantification was seen as a symbol of the attainment of scientific status, comparable to that enjoyed by the natural sciences: “There is no science without measure,” ran a slogan of the 19th century. But the process followed a different path in each discipline. While the history of quantification is now well documented by several studies, attempts to compare the social sciences from this point of view are less frequent. The way each science integrated statistical and probabilistic tools tells us something not only about its own epistemology and methodology, but also, from a sociology-of-science perspective, about its actors, its networks, its norms, its criteria of legitimacy and its controversies.
Without claiming to answer such a broad question, we propose here a small comparative panel of five disciplines: history, sociology, political science, economics and psychology. Each is itself a complex world, split into different trends and “schools” implying different paradigms, and controversies, when they are not outright conflicts, are usually present. However, what distinguishes a disciplinary field is a relative agreement on what people do not agree about, among people quite used to confrontation. Confrontations between one discipline and another, on the other hand, are unusual, for reasons related to the sociological boundaries of academic and scientific communities. Each discipline is a disciplined world, largely confined to itself, with its own vocabulary, paradigms, institutions, chairs and journals. That is why choosing the history of quantification as an interpretative framework, and as a symptom of something characteristic of these five worlds, may be a good idea, even if the exercise is very simplistic.


Symptoms and Measurement: Studying Economic Reality in France and Italy Before WWI  

Alberto Baffigi (Banca d’Italia)

In the late 1880s, scholars such as the Austrian Franz Xaver von Neumann-Spallart (1837-1888) and the Frenchman Alfred de Foville (1842-1918) made important contributions that favored the rapprochement between economics and statistics. The economic depression of the 1870s had strengthened scholars' awareness of the intrinsic instability of the economic system born with the industrial revolution. This led economists to use the available information and statistical methods to measure and predict the behavior of economic activity. The scholars engaged in the new discipline behaved, metaphorically, like the doctor who performs a diagnosis on the human body: they interpreted signs. This was economic semiology. This line of research testifies to the economists' need to bridge the gap between theory and economic reality.
The history of economic semiology shows that the development of the discipline was largely a French and Italian affair: its main founder, de Foville, was French, and the Italian Maffeo Pantaleoni, one of the main promoters of this line of research, wrote an important theoretical article on semiology in 1892, in French, in Charles Gide's Revue d'économie politique. Pantaleoni was very influential and had an impact on Rodolfo Benini, Giorgio Mortara, Costantino Ottolenghi and the Belgian statistician Armand Julin. In 1913, during the fourteenth session of the International Institute of Statistics, Julin presented a proposal to establish, within the Institute, a special commission appointed to study the statistical methods related to semiology. Among the sixteen signatories of the proposal we find Rodolfo Benini, Maffeo Pantaleoni and Lucien March. In a rapid survey of the major contributions to the history of the discipline, Julin also listed, in addition to his own work, that of the two founders of the discipline, Neumann-Spallart and de Foville, those of the Frenchmen André Liesse (1854-1954) and Yves Guyot (1843-1928), and those of Rodolfo Benini (1862-1956), Augusto Bosco (1859-1906) and Maffeo Pantaleoni (1857-1924).
Although no strangers to positivist culture, the proponents of economic semiology did not identify knowledge with scientific induction. Facts ultimately carry no informational content if they are not observed through the lens of a theory previously developed to detect and interpret them. As Rodolfo Benini pointed out in his Principi di statistica metodologica (1906), "regular patterns found in observed cases cannot be extended to cases outside of our observational field, without a bridge between the known and the unknown. This bridge is the hypothesis". It is worth recalling that only four years earlier Henri Poincaré had expressed his conventionalist epistemological views in La science et l'hypothèse.
Economic semiology should be framed within the contemporary epistemological debate which, aware of the crisis of positivism, looked for a solution inside science itself, in the careful definition of the problems of scientific research, language and logic; these thinkers rejected irrational opposition to positivism, keeping their positions far from idealism and from the "post-modern" Nietzschean stance. The rise of economic semiology required a change of perspective, a way out of the doldrums in which positivist metaphysics had ended up. Upstream of the drive to study the symptoms of economic movement lay an anti-realist and empiricist epistemological shift: logical pragmatism and empiriocriticism. Only this change of path could give rise to semiology: the interpretation of signs requires a theory, a logical and theoretical framework within which to place the observed symptoms, which would otherwise be meaningless accidents.



The Early Years of the Bureau of Agricultural Economics. Price and Crop Outlook Studies, 1922-1930

Eric Chancellier (University of Lorraine)

The U.S. Department of Agriculture created the Bureau of Agricultural Economics (BAE) on July 1, 1922. The agricultural depression of the early 1920s, which brought changes in the relative prices of farm products as well as a general lowering of farmers' purchasing power, gave the signal for renewed emphasis on the study of the dynamic forces that bring about price changes, and for a study of the possibilities of reshaping agriculture to adjust supply to demand on the basis of a satisfactory price. Henry C. Taylor, head of the BAE, began an ambitious program of producing an annual outlook for agricultural production and prices. He developed a program for collecting agricultural statistics and for developing quantitative methods to forecast commodity prices and crops (Taylor and Taylor, 1952). This paper is organized as follows. The first section presents the BAE's new method for the analysis of prices, drawing on the works of the BAE economists Bean (1929a, 1929b) and Ezekiel (1923, 1924), who developed a new analytical method, multiple and partial correlation, designed to better understand the formation of agricultural prices. The second section is devoted to crop forecasting, where the BAE's studies focused on three directions: the estimation of acreage (Bean, 1930), the estimation of crop yields (Smith, 1925a) and the impact of weather on crops (Smith, 1925b). All of these authors made extensive use of correlation to improve predictions.
Bean L.H. 1929a. A Simplified Method of Graphic Curvilinear Correlation. Journal of the American Statistical Association, 24(168): 386-397.
_______. 1929b. The farmers’ response to price. Journal of Farm Economics, 11(3): 368-385.
_______. 1930. Application of a simplified method of correlation to problems in acreage and yield variations. Journal of the American Statistical Association, 25(172): 428-439.
Ezekiel M. 1923. On the use of partial correlation in the analysis of farm management data. Journal of Farm Economics, 5(4): 198-213.
_______. 1924. A method of handling curvilinear correlation for any number of variables. Journal of the American Statistical Association, 19(148): 431-453.
Smith B.B. 1925a. Forecasting the acreage of cotton. Journal of the American Statistical Association, 20(149): 31-47.
_______. 1925b. Relation between weather conditions and yield of cotton in Louisiana. Journal of Agricultural Research, 30: 1083-1086.
Taylor H.C. and Taylor A.D. 1952. The story of agricultural economics in the United States, 1840-1932. Men, services, ideas. Iowa State College Press, Iowa.
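
As a purely illustrative aside, the partial-correlation idea that Ezekiel and Bean relied on can be sketched in a few lines of modern Python. This is a generic residual-on-residual computation, not the BAE's original graphic procedure, and the variable names and data below are invented for the example.

import numpy as np

def partial_corr(y, x, controls):
    # Correlation between y and x after removing the linear effect of the
    # control variables: regress both series on the controls and correlate
    # the residuals. A minimal sketch of the general technique only.
    Z = np.column_stack([np.ones(len(y)), controls])
    resid_y = y - Z @ np.linalg.lstsq(Z, y, rcond=None)[0]
    resid_x = x - Z @ np.linalg.lstsq(Z, x, rcond=None)[0]
    return np.corrcoef(resid_y, resid_x)[0, 1]

# Invented data: a commodity price driven by supply and by consumer income.
rng = np.random.default_rng(0)
income = rng.normal(size=200)
supply = rng.normal(size=200) + 0.5 * income
price = -0.8 * supply + 0.6 * income + rng.normal(scale=0.5, size=200)

print("simple correlation of price and supply:", round(float(np.corrcoef(price, supply)[0, 1]), 2))
print("partial correlation, income held fixed:", round(float(partial_corr(price, supply, income)), 2))

Holding the control variable fixed isolates the price-supply relation; Ezekiel (1924) extended the same logic to curvilinear relations among several variables.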


Difficulties and Ambiguities of a Probabilistic Econometrics

Michel Armatte (Université Paris Dauphine and EHESS / Centre A. Koyré)

The history of econometrics is now rich in studies, but it has suffered from an internalist methodological vision that cut it off from the history of science and from the very socio-political context in which it developed, and from a hagiographic enthusiasm that turned it into too easy a success story. As some recent research suggests (e.g. Boumans and Dupont-Kieffer, 2011), the now growing rejection of a paradigm that had become exhausted by the 1980s allows us to revisit this history with new questions. We propose here to read the history of econometrics in light of the questions concerning the possibility of a probabilistic economics. Probabilistic economic modeling, seen through its various dimensions – metaphysical (ontological vs. epistemic chance), mathematical (the foundations of the probability calculus), statistical (frequentist or subjectivist estimation) and pragmatic (the social effects it produced) – has indeed received little attention from standard historiography.
We will start from two puzzling questions:
  1. Historians should explain how a paradigm supported by a small group of people (the Cowles Commission), well trained in mathematics, probability and statistics, was able to convince a worldwide community of economists, who were totally ignorant of, or allergic to, any consideration of randomness and the probability calculus, that the solution to economic problems lay in structural and probabilistic modeling. Only a bundle of reasons related to the specific scientific regimes of WWII and the Cold War can explain this.
  2. From 1859 onwards, the statistical physics of Maxwell and Boltzmann, like the Darwinian theory of the evolution of species, revolutionized their disciplines by giving chance – a chance that was not epistemic but ontological (or objective, as Cournot said) – a constitutive place in their theory. Following the studies of the Bielefeld group on the probabilistic revolution, and other historical works, it seems that economic thought did not experience a similar revolution, either in the nineteenth century or in the 1930s with the IES.
This paper will recall that the econometricians' reference to Cournot concerned almost exclusively his concept of a mathematical economics, and not the idea of objective chance. On the other hand, as far as randomness and probability are concerned, a reference to Quetelet and Yule would be a better fit for the econometrics of Tinbergen and Frisch, of the IES, and even of the Cowles Commission. The type of chance presiding over the randomness affecting economic relations could be identified with an error or a noise disturbing a deterministic relationship, rather than with an intrinsic variability of the actors and phenomena concerned. The project of unifying the mathematical and statistical approaches, which was its creed, was largely hampered by this exclusion of chance, evident in the works and correspondence of its founders I. Fisher, C. Gini, R. Frisch and F. Divisia. We will reassess their positions with regard to randomness and probability, and, following Morgan, Le Gall and Mirowski, we will revisit the manifesto by Haavelmo (1944) and a text by Marschak (1948) that lie at the heart of the Cowles Commission's creed of a probabilistic structural econometrics, in order to detect the justifications, the interpretations and the limitations affecting the introduction of randomness into economics.
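
To fix ideas, and purely as a schematic gloss in modern notation rather than anything drawn from the abstract itself, the two conceptions of chance at stake can be written as follows: in the first, chance is an error term appended to an otherwise deterministic relation; in the second, associated with Haavelmo's probability approach, the observables themselves are treated as jointly random.

\[
\text{error view:}\qquad y_t = f(x_t;\beta) + \varepsilon_t, \qquad \varepsilon_t \sim \mathcal{N}(0,\sigma^2)
\]
\[
\text{probability view:}\qquad (y_1, x_1, \ldots, y_T, x_T) \sim F_\theta
\]

On the first reading randomness is a confession of ignorance about an exact law; on the second, every testable economic statement is a statement about the joint distribution $F_\theta$.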


Frisch's Approach: Econometrics as the Science of Measurement. Modelling as Intertwining between Theoretical Analysis and Statistical Investigation

Ariane Dupont-Kieffer (Université Paris-Est, IFSTTAR, DEST)

This paper investigates how Ragnar Frisch based econometrics on a specific articulation between theoretical and empirical measurement. His starting point is to show that humankind has a constant need to believe in the existence of regularities ruling the physical as well as the social world, and in the fact that understanding these regularities allows human beings to influence their environment. The knowledge of these regularities is rooted, in Frisch's perspective, in the synthesis of reductionist reasoning and a physicalist approach. This conception of science rests on three requirements: 1) the use of mathematical tools for scientific investigation, based on the equation of scientific laws with quantitative laws; 2) the primacy of measurement procedures in scientific work; and 3) the need to articulate theoretical and empirical measurement. These three requirements explain the key role played by the model in his process of scientific discovery. The model is then a "mediator", in the sense of Morgan and Morrison (1999), between economic theory – the corpus developed around the works of Walras, Marshall, Pareto and above all Irving Fisher – and "reality" as "reflected" in the statistical data; and modeling leads Frisch to define a specific methodology of experiment.


Schumpeter, Frisch and Lucas: The Oscillations of Economics while Dealing with Oscillations in the Economies

Francisco Louçã (ISEG-UECE, Lisboa)

The paper compares the discussions between Joseph Schumpeter and Ragnar Frisch in the early 1930s with those introduced by Robert Lucas and the RBC school more than fifty years later, on the nature (and models) of oscillations in the economies.
Typically, economics addressed these oscillations within the general equilibrium framework, and the distinction between an impulse system and a propagation apparatus was instrumental for that purpose. Yet many econometricians questioned and discussed this framework, including some of the least suspected advocates of general equilibrium.
These discussions highlight early approaches to complexity in economics.
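
As a toy illustration of the impulse-propagation distinction mentioned above (the coefficients are arbitrary, and the sketch is not drawn from either Frisch's or Lucas's models), a damped second-order difference equation supplies the propagation apparatus, while random shocks supply the impulses that keep the oscillation alive.

import numpy as np

# Toy scheme: x_t = a1*x_{t-1} + a2*x_{t-2} + e_t
# The deterministic recursion (complex roots inside the unit circle) is the
# propagation apparatus: left alone, it produces damped oscillations.
a1, a2 = 1.4, -0.7
T = 200
rng = np.random.default_rng(1)
impulses = rng.normal(size=T)   # the random impulses that re-excite the cycle

x = np.zeros(T)
for t in range(2, T):
    x[t] = a1 * x[t - 1] + a2 * x[t - 2] + impulses[t]

# Without the impulses the series would die out; with them it keeps cycling.
print("standard deviation of the simulated series:", round(float(x.std()), 2))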
