Tuesday, March 30, 2010

Bracketology Combinatorics

This is combinatorics, not economics. I suppose this sort of math is useful for political science in calculating power indices.

Consider a single-elimination tournament with 2^n teams entered. Ignoring the play-in games, n is six for the NCAA basketball tournament. Some tournaments have "byes" in the first round, and I am ignoring that aspect of those tournaments too.

Let f(n) be the number of ways of filling out your brackets for the tournament. The problem considered here is to calculate f(6).

Since 2^n teams are in the tournament, 2^(n - 1) games are played in the first round. Since there are two possible victors for each game, there are 2^(2^(n - 1)) possible ways of filling out your brackets for the first round. The total number of ways of filling out your brackets is then:
f(n) = 2^(2^(n - 1)) f(n - 1)
One could guess a closed-form solution for this difference equation (namely, f(n) = 2^(2^n - 1)) and prove it correct by mathematical induction. If I recall correctly, generating functions are useful in finding the solution to such recurrence relations. But, since I am only interested in a few terms, I adopt the brute force calculations shown in Table 1.
Table 1: Tournament Counts
n | Number of Teams | 2^(2^(n - 1)) | Ways of Filling Out Brackets
1 | 2  | 2         | 2 = 2^1 ≈ 10^0.301
2 | 4  | 4         | 8 = 2^3 ≈ 10^0.903
3 | 8  | 16 = 2^4  | 128 = 2^7 ≈ 10^2.11
4 | 16 | 256 = 2^8 | 2^15 ≈ 10^4.515
5 | 32 | 2^16      | 2^31 ≈ 10^9.33
6 | 64 | 2^32      | 2^63 ≈ 10^19.0

Suppose, by some luck of the draw, you correctly chose the 32 winners of all the first round games. There would still be over one billion (10^9) possible ways of filling out the remainder of the brackets. Even if everybody in the world filled out a bracket for the full tournament, the overwhelming majority of possible brackets would still be unfilled.
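For anyone who wants to check Table 1, here is a minimal Python sketch of the brute-force calculation (the function name f follows the notation above):

```python
from math import log10

def f(n):
    """Ways of filling out brackets for a single-elimination tournament with
    2**n teams, from the recurrence f(n) = 2**(2**(n - 1)) * f(n - 1)."""
    if n == 0:
        return 1  # a one-team "tournament" has a single, empty bracket
    return 2 ** (2 ** (n - 1)) * f(n - 1)

# Reproduce Table 1: n, teams, first-round possibilities, total brackets (log base ten).
for n in range(1, 7):
    print(n, 2 ** n, 2 ** (2 ** (n - 1)), f(n), round(log10(f(n)), 3))
```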

Saturday, March 27, 2010

Lo and Mueller's Need for More Scholarship

Consider Andrew W. Lo and Mark T. Mueller's draft paper "WARNING: Physics Envy May Be Hazardous To Your Wealth!", to appear in the Journal of Investment Management. They argue that economists' "physics envy" has led to "a false sense of mathematical precision in some cases", and they illustrate their argument by pointing to Paul Samuelson. They distinguish between uncertainty and risk and offer a checklist to assess the degree of uncertainty in your decision-making environment. They mention chaotic dynamics.

I find their references lacking. They don't reference Philip Mirowski, particularly his book More Heat Than Light, in which he considers Samuelson. They do reference Frank Knight's Risk, Uncertainty, and Profit, but not Keynes. They do not reference G. L. S. Shackle and his role in the development of scenario planning. No reference to Joan Robinson appears. I think her 1974 paper "History vs. Equilibrium", with its distinction between logical time and historical time, is particularly apropos. Paul Davidson's 1982 JPKE paper, "Rational Expectations: A Fallacious Foundation for Studying Crucial Decision-Making Processes", is also of some importance, with its emphasis on the mainstream economist's assumption of ergodicity. J. Barkley Rosser, Jr., with his treatment of insights from complex dynamics, is also unreferenced.

I don't see why one would want to read Lo and Mueller until they engage some of this literature on their point.

Wednesday, March 24, 2010

Me, Elsewhere

At NEP-DGE Blog, I commented on some post about some dynamic general equilibrium model explaining the distributions of wealth and income:
"Chapter 13 of Cockshott, Cottrell, Michaelson, Wright, and Yakovenko’s Classical Econophysics (Routledge, 2009) explains the distribution of income and wealth to some extent. They have social classes and are interested in statistical equilibrium, as in thermodynamics. I don’t know why one should care about what [one] can do in the failed neoclassical paradigm."
Over at Crooked Timber, I mentioned some books that I think were influential for me:
"I guess the Lord of the Rings is the book I’ve read the most times.

I read the Bible from cover to cover once at an early age.

One friend in college had a couple of serious books of physics. So, if I was going to spout off on politics, I ought to read some serious books on economics. The two I found in a used book store were Keynes’ General Theory (which I reread several times) and Von Neumann and Morgenstern’s Theory of Games. Part of the influence of these is to show me I can read original research, whether I understand it or not. I’ve read a number of books others have listed, but one can say that that’s a consequence of this lesson.

Somewhere I came across a reference to Joan Robinson as “the english Galbraith”. I had liked Galbraith, so I read her. I read a lot of her collections and then Sraffa’s Production of Commodities, as well as secondary literature such as Geoff Harcourt’s book, Some Cambridge Controversies. The lesson here is that almost everything economics professors were teaching me as an undergraduate had been shown to be mostly nonsense decades before.

Somewhere in here I read Schumpeter’s History and Hayek’s Individualism and Economic Order. Basically, I read Hayek before I found out right-wingers cite him without reading him. Why wouldn’t a leftist who has also read Orwell accept that Stalinist central planning couldn’t be expected to work well?

I had read a lot of commentary – I particularly like Harrington’s The Twilight of Capitalism – before reading Marx with understanding. I actually read Theories of Surplus Value before the first volume of Capital.

I found some works of economic history eye-opening – maybe Braudel’s Capitalism and Material Life, Hobsbawm’s The Age of Revolution: 1789-1848, or Polanyi’s The Great Transformation.

I’m not sure about what were the earliest works in philosophy that I think I might have understood somewhat – probably some Russell, Kuhn’s Structure of Scientific Revolutions, or Popper’s The Open Society and Its Enemies. Wittgenstein’s Philosophical Investigations is on my list of books I’ve read multiple times."
With this exercise, you will see books on others' lists that you maybe should have put on yours. Then there are all the books I haven't yet read, have unjustly forgotten, or never understood in the first place. I'll refrain from commenting on any other comments on Crooked Timber, but I will note that young Matt Zeitlin includes Rorty's Achieving Our Country - a good book - on his list.

Sunday, March 21, 2010

James Galbraith "In Defense of Deficits"

James Galbraith speaks up against "one of the great misinformation campaigns of all time":
"To put things crudely, there are two ways to get the increase in total spending that we call 'economic growth.' One way is for government to spend. The other is for banks to lend. Leaving aside short-term adjustments like increased net exports or financial innovation, that's basically all there is. Governments and banks are the two entities with the power to create something from nothing. If total spending power is to grow, one or the other of these two great financial motors--public deficits or private loans--has to be in action." -- James K. Galbraith, "In Defense of Deficits", The Nation, 4 March 2010

Friday, March 19, 2010

Historians and Philosophers on Empirical Failures of Neoclassical Economics

Empirical evidence went against neoclassical economics in the following three cases:
  • Empirical studies and surveys of businessmen found that they followed a full cost policy, not marginalism
  • Behavioral economists have accumulated a body of experimental evidence, including preference-reversals and violations of transitivity, that people are not utility maximizers.
  • David Card and Alan Krueger found that increased minimum wages did not decrease employment.
These incidents present data for philosophers, historians, and sociologists of economics. They can explore how mainstream economists reacted to these empirical findings. And three have done just this. Daniel Hausman and Philippe Mongin compare and contrast the reactions to full cost pricing and preference reversals. Tim Leonard compares and contrasts the reaction of mainstream economists to their findings on full cost pricing and on the minimum wage. In keeping with current trends, these articles are descriptive, not prescriptive. That is, they try to understand the positions of participants without passing judgement.

References

Wednesday, March 17, 2010

Kaldor's Model of Industry and Agriculture

In comments, an anonymous poster asks:
Is the Kaldorian model on industrial and General productivity in an economy applicable in understanding economic development in LDC's like Zambia?
I don't know anything about Zambia.

As I understand it, Kaldor developed that model for the world as a whole. Thirlwall applied that model to a Less Developed Country in a 1986 paper.

In the model, labor is originally not scarce - there is disguised unemployment in agriculture. Industrial production, unlike agricultural production, experiences increasing returns. If wages are low in agriculture, there would be more savings to finance the expansion of industry. But there might not be the demand for industrial products. Demand for industrial products is increased by a relatively low price of industrial products, as compared to the price of agricultural products. Demand for industrial commodities might also come from exports. A dynamic equilibrium arises in the model in which a steady state of growth is achieved and the terms of trade between agriculture and industry are determined. The model is supposed to capture certain stylized facts and to exhibit a certain complementarity between agriculture and industry. It is also supposed to suggest different possibilities, such as the possibility of economic development from favorable terms of trade for agriculture at an initial stage and export-oriented growth at a later stage.
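The following is a toy two-sector simulation of my own devising, loosely inspired by the verbal account above; it is not Kaldor's or Thirlwall's model, and the growth rules and parameter values are purely illustrative assumptions. It only shows the flavor of the story: increasing returns in industry pull the terms of trade toward a level consistent with steady growth.

```python
# A toy two-sector sketch, loosely inspired by the verbal account above.  It is
# NOT Kaldor's or Thirlwall's model: the growth rules and all parameter values
# are my own illustrative assumptions.

g_ag = 0.02         # exogenous growth rate of agricultural output and income
autonomous = 0.01   # autonomous industrial productivity growth
verdoorn = 0.5      # increasing returns: productivity growth rises with output growth
elasticity = 0.075  # response of industrial demand growth to the terms of trade

terms_of_trade = 1.2  # initial price of industrial goods relative to agricultural goods

for t in range(60):
    # Industrial demand grows with agricultural income and with a relatively
    # low price of industrial goods.
    demand_growth = g_ag + elasticity * (1.0 - terms_of_trade)
    # Verdoorn-style rule: faster industrial growth raises productivity growth,
    # which lowers the relative price of industrial goods.
    productivity_growth = autonomous + verdoorn * demand_growth
    terms_of_trade *= 1.0 + demand_growth - productivity_growth

# In this toy, the terms of trade drift toward the level (here, 1.0) at which
# industrial demand growth just matches the growth warranted by increasing returns.
print(round(terms_of_trade, 3))
```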

Later essays in the Thirlwall book in which this essay was reprinted treat Africa, particularly Sudan. This empirical work considers some issues beyond those in the model outlined verbally above. I don't know current thinking about these issues, although I question the thinking that had been dominant in the International Monetary Fund, that is, the "Washington consensus".

  • A. P. Thirlwall (1986) "A General Model of Growth and Development on Kaldorian Lines", Oxford Economic Papers (July) (Reprinted in The Economics of Growth and Development: Selected Essays of A. P. Thirlwall, Edward Elgar (1995))

Friday, March 12, 2010

Anti-Intellectualism Among Mainstream Economists

I find these comments to be anti-intellectual:
  • John Quiggin rejects the Austrian school of economics on the ground that partisans of that school discuss political philosophy and the epistemology and methodology of economics.
  • Roberto Perotti criticizes Post Keynesians and neo-Ricardians on the grounds that they don't spend their time exclusively constructing formal models and estimating correlations. (I used Google's translation feature. Sergio Cesaratto answers from a Sraffian perspective.)
  • Commentators at Mark Thoma reject discussions about what Adam Smith wrote.
I thought the point of scholarship was to attempt to make true statements. If somebody makes an untrue statement about what Keynes or Adam Smith said, one should correct them. This is not to say that the fact that Keynes or Smith advocated something or other is a justification for policy. I think a historically accurate representation of an old text entails quite a bit of contextualization in terms of its time. To apply policy conclusions to our time would require recontextualization in contemporary terms, as well as empirical work.

I would think different scholars, even within a discipline, would find different questions of interest. Some economists argue for a supposed freedom to choose. Shouldn't some then be legitimately allowed to explore old texts or methodology or whatever? If Thomas Kuhn was somewhat correct, wouldn't one expect more discussion about methodology when the defining paradigm in a field has so obviously broken down, as today among mainstream economists?

Thursday, March 11, 2010

On Sraffa, Elsewhere

Alex M. Thomas has begun a series of posts about Piero Sraffa's work in economics.

Saturday, March 06, 2010

Survey of Utility Theory?

1.0 Introduction
I think utility theory has a canonical textbook presentation. Many variations seem to exist. In some, additional structure is imposed on the (commodity?) space over which agents choose. In others, more basic assumptions are made from which preferences can be derived in certain special cases.

I'd like to know if there are any surveys to read over these variations. I'm not insisting on something critical. And, given the dryness of the subject matter, I might not put such a survey at the top of my queue. As can be seen below, I'm not sure of the field that would be demarcated by such a survey. But literature surveys, in some sense, construct their object.

2.0 Textbook Treatment
Consider a space of n commodities. Each element of the space is a vector x = (x_1, x_2, ..., x_n). Under the usual interpretation, x_i is the quantity of the i-th commodity.

An agent is modeled as having a preference relation, ≤, over the space of commodities. A typical question is what assumptions must hold for a utility function to exist. A utility function u(x) exists if, for all x and y in the space of commodities:
x ≤ y if and only if u(x) ≤ u(y)

Typically, the preference relation is taken to be a total preorder, that is, complete, reflexive, and transitive. A preference relation is complete if, for all x and y in the space of commodities,
x ≤ y or y ≤ x
A preference relation is reflexive if, for all x in the space of commodities,
x ≤ x
A preference relation is transitive if, for all x, y, and z in the space of commodities,
if x ≤ y and y ≤ z then x ≤ z
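As a concrete illustration (my own, with a tiny finite set standing in for the commodity space), here is a Python sketch that checks whether a relation, given as a set of ordered pairs, satisfies these three axioms:

```python
from itertools import product

# A tiny finite set standing in for the commodity space.
commodities = ["x", "y", "z"]

# A candidate preference relation as a set of pairs (a, b) meaning a ≤ b,
# that is, b is at least as good as a.  This example ranks x below y below z.
relation = {("x", "x"), ("y", "y"), ("z", "z"),
            ("x", "y"), ("y", "z"), ("x", "z")}

def complete(rel, space):
    return all((a, b) in rel or (b, a) in rel for a, b in product(space, space))

def reflexive(rel, space):
    return all((a, a) in rel for a in space)

def transitive(rel, space):
    return all((a, c) in rel
               for a, b, c in product(space, space, space)
               if (a, b) in rel and (b, c) in rel)

print(complete(relation, commodities),
      reflexive(relation, commodities),
      transitive(relation, commodities))  # True True True
```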

If the quantities of commodities fall along a continuum, a preference relation being a total preorder is not sufficient for a utility function to exist. Lexicographic preferences are an example of a preference relation for which a utility function does not exist. A continuity assumption rules out this case. This assumption is that, for all x in the space of commodities, the sets {y | y ≤ x} and {z | x ≤ z} of commodities not preferred to x and commodities x is not preferred to, respectively, are closed.

Theorem: If a preference relation is a total preorder and is continuous in the above sense, then a utility function exists.

The utility function is only defined up to a monotonically increasing transformation. In other words, utility is ordinal. Typical exercises are to show certain properties of utility functions, such as ratios of marginal utilities (du/dx_i)/(du/dx_j), are invariant over the set of such transformations.
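A quick numerical check of this invariance (a sketch of my own, with a Cobb-Douglas utility function chosen arbitrarily and finite differences standing in for the derivatives):

```python
import math

def u(x1, x2):
    """An arbitrary utility function (Cobb-Douglas, chosen for illustration)."""
    return x1 ** 0.3 * x2 ** 0.7

def v(x1, x2):
    """A monotonically increasing transformation of u."""
    return math.log(u(x1, x2)) + 5.0

def marginal_ratio(f, x1, x2, h=1e-6):
    """Ratio of marginal utilities (du/dx_1)/(du/dx_2), by finite differences."""
    du_dx1 = (f(x1 + h, x2) - f(x1, x2)) / h
    du_dx2 = (f(x1, x2 + h) - f(x1, x2)) / h
    return du_dx1 / du_dx2

# The ratio of marginal utilities is (approximately) the same for u and v.
print(marginal_ratio(u, 2.0, 3.0), marginal_ratio(v, 2.0, 3.0))
```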

3.0 Probability
Von Neumann and Morgenstern generalized the commodity space to include vectors of the form (p_1, x^(1); p_2, x^(2); ...; p_m, x^(m)), where:
p_1 + p_2 + ... + p_m = 1
A commodity, in this sense, is a lottery. Each superscripted commodity vector x^(i) is associated with a probability p_i that it will be chosen.

Von Neumann and Morgenstern defined a new set of axioms to go along with their redefined commodity space. One implication is that for any two elements x and y in the commodity space, the linear combination (p, x; (1 - p), y) is also in the space. They obtain that a utility function exists, and it acts like mathematical expectation:
u(p_1, x^(1); p_2, x^(2); ...; p_m, x^(m)) = p_1 u(x^(1)) + p_2 u(x^(2)) + ... + p_m u(x^(m))

Under Von Neumann and Morgenstern's approach, utility functions are only defined up to affine transformations. That is, they are cardinal. In other words, they attain an interval scale of measurement. The utility for a lottery depends only on the probabilities and the resulting outcomes. It does not depend on how many spins of the wheel or rolls of the dice are needed to decide between otherwise equivalent lotteries. Gambling is assumed to have no utility or disutility.
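To illustrate (again, a sketch of my own, with an arbitrary utility-of-outcome function), expected utility ranks lotteries, and that ranking is unchanged by an affine transformation of the utility function:

```python
def expected_utility(lottery, u):
    """A lottery is a list of (probability, outcome) pairs whose probabilities sum to one."""
    return sum(p * u(x) for p, x in lottery)

u = lambda x: x ** 0.5          # an arbitrary utility-of-outcome function
v = lambda x: 3.0 * u(x) + 7.0  # an affine transformation of u

safe = [(1.0, 50.0)]                # 50 for sure
risky = [(0.5, 0.0), (0.5, 100.0)]  # a fair coin flip between 0 and 100

# Both u and v rank the sure thing above the risky lottery (risk aversion here),
# because affine transformations preserve the expected-utility ordering.
print(expected_utility(safe, u) > expected_utility(risky, u),
      expected_utility(safe, v) > expected_utility(risky, v))  # True True
```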

Leonard Savage develops axioms of probability concurrently with axioms of utility theory in his personalistic approach to probability and statistics. I'm not sure how much the survey I would like would go into approaches to probability, even if probability is important to decision theory. The same comment applies to game theory.

4.0 Attributes and Needs
Some see commodities as being chosen as an indirect means to choose something more abstract. As I understand it, Kevin Lancaster depicts a commodity as a bundle of attributes. Different commodities can have some attributes in common. A choice of an element in the space of commodities can then be related to an element in a space of commodity attributes.

The early Austrian school economists thought of goods as being desired for the satisfactions of wants. Water, for example, can be used to water your lawn, to satisfy a pet's thirst, or to drink yourself. One can imagine ranking wants in disparate categories. I am thinking of the triangular tables in Chapter III of Carl Menger's Principles of Economics, in Book III, Part A, Chapter III of Eugen von Böhm-Bawerk's Positive Theory of Capital, and in Chapter IV of William Smart's An Introduction to the Theory of Value. The tables are triangular because the most pressing want in one category typically is less pressing than the most pressing want in another category. An element in the space of commodities corresponds to the set of wants that the agent would choose to satisfy with the quantities of commodities specified by that element.

This mapping from quantities of commodities to sets of wants leads to a redefinition of marginal utility, which one might as well designate by a new name - the marginal use. The marginal use of an additional quantity of a commodity is, roughly, the set of wants that the additional quantity adds to the set of wants satisfied by the given quantities of commodities. McCulloch shows that a ranking of wants in different categories can arise such that a measure does not exist for the space of sets of wants. (A measure in this sense is a technical term in mathematics, typically taught in courses in analysis or in advanced courses in the theory of probability.) He argues that the Austrian theory of the marginal use is thus ordinal. Surprisingly, his argument implies that the law of diminishing marginal utility does not require utility to be measured on a cardinal scale.
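The following Python sketch is entirely my own construction, not McCulloch's formalism; it only illustrates the idea of mapping quantities of a commodity to sets of wants (using the water example above) and computing the marginal use as a set difference:

```python
def wants_satisfied(units_of_water):
    """A hypothetical mapping from a quantity of one commodity (units of water)
    to the set of wants the agent would choose to satisfy with it.  The wants
    and their ordering are my own illustrative assumptions."""
    wants = set()
    if units_of_water >= 1:
        wants.add("drink yourself")
    if units_of_water >= 2:
        wants.add("satisfy a pet's thirst")
    if units_of_water >= 5:
        wants.add("water the lawn")
    return wants

# The marginal use of the fifth unit of water: the wants added (as a set
# difference) by that additional unit, given four units already on hand.
print(wants_satisfied(5) - wants_satisfied(4))  # {'water the lawn'}
```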

I haven't read Ian Steedman's work on consumption, but I think I'll mention it here.

5.0 Choices from Menus
Another generalization of the textbook treatment is to examine how a preference relation can be built out of a more fundamental structure. Imagine the agent is presented with a menu, where a menu is a nonempty set of elements of the commodity space. The agent is assumed to have a choice function, which maps each menu to the set of best choices, in some sense, in that menu. The agent is not postulated to rank either the elements not chosen for a given menu or the elements in the choice set.

A question: what constraints need to be put on choices out of menus such that preferences exist? Since a choice function can be constructed for which no rationalizing preference relation exists, some such constraints are needed. I previously noted literature drawing on the logical structure of social choice theory in this context. Alan Isaac emphasizes temporal and menu independence in his overview of abstract choice theory.
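Here is a brute-force sketch of my own over a three-element space: enumerate every complete and transitive relation and check whether any of them rationalizes a given choice function. The first choice function below is rationalizable; the second, which cycles over the two-element menus, is not:

```python
from itertools import combinations, product

space = ["a", "b", "c"]
menus = [frozenset(m) for r in (1, 2, 3) for m in combinations(space, r)]

def rationalizes(rel, choice):
    """Does the relation rel (pairs (x, y) meaning 'y is at least as good as x')
    generate the given choice function over all menus?"""
    for menu in menus:
        best = {x for x in menu if all((y, x) in rel for y in menu)}
        if best != choice[menu]:
            return False
    return True

def rationalizable(choice):
    """Brute force: try every complete and transitive relation on the space."""
    pairs = list(product(space, space))
    for bits in product([False, True], repeat=len(pairs)):
        rel = {p for p, keep in zip(pairs, bits) if keep}
        if not all((x, y) in rel or (y, x) in rel for x, y in pairs):
            continue  # not complete
        if not all((x, z) in rel
                   for x, y, z in product(space, space, space)
                   if (x, y) in rel and (y, z) in rel):
            continue  # not transitive
        if rationalizes(rel, choice):
            return True
    return False

# A choice function consistent with the ranking a over b over c.
consistent = {m: frozenset({min(m)}) for m in menus}
# A "cycling" choice function: a from {a, b}, b from {b, c}, but c from {a, c}.
cycling = dict(consistent)
cycling[frozenset({"a", "c"})] = frozenset({"c"})

print(rationalizable(consistent), rationalizable(cycling))  # True False
```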

6.0 Experimental Economics
I am emphasizing theory. A literature exists on experiments, many of which have falsified the textbook treatment of economics.

7.0 Computability, Conservation Laws, Etc.
Some of the above extensions of the textbook treatment seem to postulate some sort of structure within the agent's mind. Computers provide an arguable metaphor for mental processes, and some literature applies the theory of computability to economics. Gerald Kramer, for example, shows that no finite automaton can maximize utility in the simplest setting. I gather others have shown that the textbook treatment postulates that each agent's computational powers exceed those of a Turing machine, that agents compute functions that are, in fact, noncomputable. I turn to Kumaraswamy Velupillai's work for insights into computability, constructive mathematics, and economics. Philip Mirowski is always entertaining. One might also mention the literature on Herbert Simon's notion of satisficing.

8.0 Conclusion
This post is a brief overview of some of what would be treated in a survey of variations and approaches to utility theory. Apparently, the notion of economic man can be complicated.

An Incomplete List of References
  • Colin F. Camerer (2007) "Neuroeconomics: Using Neuroscience to Make Economic Predictions", Economic Journal, V. 117 (March): C26-C42.
  • Alan G. Isaac (1998) "The Structure of Neoclassical Consumer Theory"
  • Daniel Kahneman and Amos Tversky (1979) "Prospect Theory: An Analysis of Decision under Risk" Econometrica, V. 47, N. 2 (March): pp. 263-292
  • Gerald H. Kramer () "An Impossibility Result Concerning the Theory of Decision-Making", Cowles Foundation Paper 274
  • Kevin J. Lancaster (1966) "A New Approach to Consumer Theory", Journal of Political Economy, V. 75: pp. 132-157.
  • J. Huston McCulloch (1977) "The Austrian Theory of the Marginal Use and of Ordinal Marginal Utility", Journal of Economics, V. 37, N. 3-4: pp. 249-280.
  • Judea Pearl (1988) Probabilistic Reasoning in Intelligent Systems: Networks of Plausible Inference, Morgan Kaufmann
  • Leonard J. Savage (1954, 1972) The Foundations of Statistics, Dover Publications
  • Chris Starmer (1999) "Experimental Economics: Hard Science or Wasteful Tinkering?" Economic Journal, V. 109 (February): pp. F5-F15
  • Ian Steedman (2001) Consumption Takes Time: Implications for Economic Theory, Routledge
  • S. Abu Turab Rizvi (2001) "Preference Formation and the Axioms of Choice", Review of Political Economy, V. 13, N. 12 (Nov.): pp. 141-159
  • John Von Neumann and Oskar Morgenstern (1953) Theory of Games and Economic Behavior, Third Edition, Princeton University Press

Wednesday, March 03, 2010

Labor Market Flexibility

Some mainstream economists claim that unemployment would be lower if labor markets were more flexible and less rigid. In a 1998 paper arguing against this view, Bob Solow explains what the labor-market rigidity that so many mainstream economists, especially "freshwater" economists, want to abolish actually is:
"My first observation is that 'labour-market rigidity' is never defined very precisely or directly in this context, but only be enumeration of tell-tale symptons. Thus a labour market is inflexible if the level of unemployment-insurance benefits is too high or their duration is too long, or if there are too many restrictions on the freedom of employers to fire and to hire, or if the permissible hours of work are too tightly regulated, or if excessively generous compensation for overtime work is mandated, or if trade unions have too much power to protect incumbent workers against competition and to control the flow of work at the site of production, or perhaps if statutory health and safety regulations are too stringent. It seems clear that those who point to labour-market rigidity as the source of high unemployment have something other than simple nominal or real wage rigidity in mind, or so shall I assume." -- Robert M. Solow, "What is Labour-Market Flexibility? What is it Good for?", Proceedings of the British Academy, V. 97
As I understand it, many of these "rigidities" were put in place in the United States context around the time of the New Deal with the cooperation and assistance of Institutionalist economists, a school with some sense of the real world.