Thursday, July 7, 2011

Second Eggs

I read Tim Harford's book Undercover Economist; the second edition contains an argument about
mortgage-backed securities and further derivatives.

The author must have included this on the notion that some of the financial instruments involved in the world's recent financial crisis are similar to "second eggs."  He mentions that mortgage-backed securities were derivative on mortgages being generally safe, but that further derivatives were taken out as well.  And so he suggests we consider a toy model in which you are offered an investment which we'll call an egg.  The investment will probably deliver value; on the other hand, it might become worthless.  We estimate that eggs go bad with probability p.  From eggs, we get an investment called "second eggs": each second egg names a "carton" of 6 eggs and pays off if the number of bad eggs turns out to be 0 or 1.  From second eggs, we can of course get a carton of second second eggs, and so on.

This emphasizes that while p represents risk, estimating p wrongly is also a risk.  And yet a basic notion in probability theory is that we have to guess something and then reason to a conclusion.  We have to guess p and then see what the consequences are.  Or we can guess a distribution for p and see what the consequences are.  But what are the consequences of guessing the wrong distribution?  Probability theory should hope that the recursion stops here... that a distribution of distributions for p is no different from a distribution for p.  And we could hope that there are only two sorts of error:

-- the probability p that an investment goes sour;

-- the error in estimating p; call it p'.   

He points out that if eggs go bad with probability p, then "the second egg to go bad" is an investment which goes sour with probability q.  He then points out two interesting things: if p is below 5%, then q < p.  I.e., the second bad egg in a batch of six is a safer bet than buying a single egg.  On the other hand, if p is 10% or more, then q > p.  Supposing that we have no information about second eggs other than that they are ... second eggs ... then to estimate q we must minimize p'.  If we can't guarantee that p < 5%, then we can't guarantee that q < p.  I made this table just to check.  In the left column, the probability that one egg is bad is 5%; in the right column it is 10%.  Then we calculate the probability that the number of bad eggs in a batch of 6 is at least 2.

Example from Tim Harford's book Undercover Economist, 2nd edition.

Batch of six eggs, ranked with the worst ones first (perhaps this
ranking is revealed later).

Probability that one egg is bad:
                  0.05          0.1
Distribution of the number of bad eggs:
  0               0.735091891   0.531441
  1               0.232134281   0.354294
  2               0.030543984   0.098415
  3               0.002143438   0.01458
  4               8.46094E-05   0.001215
  5               1.78125E-06   0.000054
Probability that the first egg is bad:
                  0.264908109   0.468559
Probability that the second egg is bad:
                  0.032773828   0.114265
Probability that the third egg is bad:
                  0.002229844   0.01585
Probability that the fourth egg is bad:
                  8.64063E-05   0.00127
Probability that the fifth egg is bad:
                  1.79688E-06   5.5E-05
Probability that the sixth egg is bad:
                  1.5625E-08    1E-06
(The author mistakenly rounds 0.264908109 to 0.27.)

Batch of six SECOND eggs, ranked with the worst ones first (perhaps
this ranking is revealed later).

Probability that one second egg is bad:
                  0.032773828   0.114265
Distribution of the number of bad second eggs:
  0               0.818781906   0.482862
  1               0.166463346   0.373752
  2               0.014101255   0.12054
  3               0.000637082   0.020734
  4               1.61903E-05   0.002006
  5               2.1944E-07    0.000104
Probability that the first second egg is bad:
                  0.181218094   0.517138
Probability that the second second egg is bad:
                  0.014754748   0.143386
Probability that the third second egg is bad:
                  0.000653493   0.022846
Probability that the fourth second egg is bad:
                  1.6411E-05    0.002112
Probability that the fifth second egg is bad:
                  2.20679E-07   0.000106
Probability that the sixth second egg is bad:
                  1.23926E-09   2.23E-06

Batch of six SECOND SECOND eggs, ranked with the worst ones first.

Probability that one second second egg is bad:
                  0.014754748   0.143386
Distribution of the number of bad second second eggs:
  0               0.914673513   0.395103
  1               0.082187318   0.396811
  2               0.003077034   0.166053
  3               6.1441E-05    0.03706
  4               6.90092E-07   0.004653
  5               4.13385E-09   0.000312
Probability that the first second second egg is bad:
                  0.085326487   0.604897
Probability that the second second second egg is bad:
                  0.003139169   0.208086
Probability that the third second second egg is bad:
                  6.21353E-05   0.042033
Probability that the fourth second second egg is bad:
                  6.94236E-07   0.004973
Probability that the fifth second second egg is bad:
                  4.14417E-09   0.00032
Probability that the sixth second second egg is bad:
                  1.03176E-11   8.69E-06
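The whole table follows from one binomial calculation, iterated.  Here is a minimal sketch; the carton size of 6 and the two-bad-eggs threshold come from the text, and the function name is my own:

```python
from math import comb

def second_egg_risk(p, n=6):
    """Probability that at least 2 of n eggs go bad, i.e. the
    probability that a 'second egg' on a carton of n goes sour."""
    p_none = (1 - p) ** n                       # no bad eggs
    p_one = comb(n, 1) * p * (1 - p) ** (n - 1) # exactly one bad egg
    return 1 - p_none - p_one

for p in (0.05, 0.10):
    q = second_egg_risk(p)    # second eggs
    q2 = second_egg_risk(q)   # second second eggs
    print(f"p={p}: q={q:.9f}, second second q={q2:.9f}")
```

For p = 0.05 this reproduces the chain 0.032773828, 0.014754748 from the left column, and for p = 0.10 the chain 0.114265, 0.143386 from the right column, confirming that q < p on one side of the threshold and q > p on the other.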

Friday, June 24, 2011

Two-Envelope Paradox II

This post continues a discussion of the Two-Envelope Paradox which has been going on at 
Res Cogitans blog.

Suppose you enter a game show where two envelopes are on display.  You are told that the envelopes contain gameshow points -- points are translated into currency at a constant (linear) rate c at the end of the show.  You are initially asked whether you prefer one envelope to the other.  They are blank and identical, so you do not.  You are then shown the value in one envelope -- 2000 gameshow points.  At this point, you might reason:

1. as you don't know c, there is still no point in swapping.
2. if there is no point in swapping, then the expectation value of swapping is 2000.
3. if the expectation value is 2000, then 2000 = p*1000 + (1-p)*4000
4. therefore the probability of swapping and getting 1000 is p = 2/3.
5. this calculation holds whatever value was in the first chosen envelope.

  i.e., p(the other envelope is worse | your first envelope's value is X) = 2/3 for every condition assigning a value X to the first envelope.

6. when given the initial choice of envelopes you have a 2/3 probability of choosing the higher amount.
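The algebra in steps 3 and 4 can be checked directly, using the 1000/2000/4000 amounts from the list above:

```python
from fractions import Fraction

# Step 3: 2000 = p*1000 + (1-p)*4000; solve the linear equation for p
p = Fraction(4000 - 2000, 4000 - 1000)

assert p * 1000 + (1 - p) * 4000 == 2000
print(p)  # 2/3 -- the probability that the other envelope holds 1000
```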

The initial indifference (call it step 0) and 6 clearly contradict: a blind choice between two identical envelopes cannot pick the higher amount with probability 2/3.  Where did the reasoning go wrong?

I believe it went wrong here: 6 does not follow from 1-5.

From p(y|x) for various x we want to infer p(y).  Let y be the case "the other envelope is better," and let x be the condition "the value in the envelope is X."  If we had an integrable prior distribution of envelope values, we could reason using the "law of total probability" that p(y) = ∫ p(y|x) p(x) dx.  When the distribution of X is not integrable, can we reason according to a law such as the following?

Law?: if p(y|x) is constant as x ranges over all possible conditions, then the total probability p(y) takes that same value.

When X is not integrable, we can gather the conditions x into pairs and find that the conditional probability p(y | pair of conditions) comes out different -- say, twice as good; then p(y) would take on a larger value by that same law.  So the law is not valid: it assigns p(y) different values depending on how we group the conditions.

Arguments 1..5 tease out the consequences of "indifference."  I don't think 1..5 are logically necessary at all, but they make a compelling argument about the implications of indifference: what to do if you don't know what gameshow points are worth to you, and you only know that more points are worth more to you.
Once you assume that utility is linear in gameshow points and that p(y|x) is constant, the chain of reasoning 1..5 is correct, leading inexorably to conclusion 5.  But one could go on reasoning after 5, reaching problematic conclusions about small amounts of money and about large amounts of money.  I will try to do that now.

7. Arbitrarily small values exist in the game.  Whatever values we see, we should have expected smaller ones.

What is the domain of the gameshow points?  Is there a minimal value?  1..5 constrains the distribution of envelopes so that for all x a value of x/2 is always twice as likely as 2x.  So, there is no minimal amount of money in this game, and the probability of tiny amounts outweighs the probability of large amounts.  If the first envelope contains 0.01 gameshow points, then I might still switch down without losing expected value, because in fact the smaller values are more likely.  There is no "gameshow cent."  1 is rarer than 0.1, which is rarer than 0.01, which is rarer than 0.001, etc., so the treatment of small values is not going to come up only rarely.  Perhaps part of the paradox is being pushed into the gap between 0 and arbitrarily small values.  As a result, gameshow points cannot translate into real money.  We could suppose that there is a constant c, such that gameshow points times c = currency in your hometown.  But currency in your hometown doesn't exist in arbitrarily small amounts.  So there has to be a minimal gameshow value.  But by arguments 1..5 there is none.

8. The game show is obtaining its values in some way other than choosing them from a distribution. 

In order to compute p(y) from p(y|x) using the law of total probability, and in order to assign probabilities at all, we should form an integrable prior distribution.  This will simply violate 1: the notion that you don't know what gameshow cash is worth, so you will treat all values the same.  Of course, you can resist forming a prior distribution and still reason... but then an "implied distribution" is still present behind your actions.  The implied distribution seems to be x^(-0.5), since for that distribution the chance of switching down is twice the chance of switching up.  This distribution is not integrable.  Not only does it put tremendous weight on small values, it effectively puts nonzero weight at very large values (or at infinity).  If we cut this distribution off at an arbitrary top and bottom value, we find a very interesting fact: you *should* switch up from small values, but from top values you should not.  How is the game show supposed to generate its random envelope values?  It must have, in fact, an integrable distribution.  More likely, it has a bucket full of envelopes -- a discrete distribution.  The customer may well not know it, but some such distribution must exist in order to create the game show.
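That cutoff observation can be made concrete with a sketch.  Suppose (my choice of toy prior, not anything fixed by the argument) the game show stocks envelope pairs (2^k, 2^(k+1)) for k = 0..K, with the pair's weight halving at each doubling -- a discrete, truncated analogue of the implied distribution.  Conditional on seeing a middle value, the other envelope is worse with probability exactly 2/3, but at the bottom you should always switch up and at the top never:

```python
from fractions import Fraction

K = 10  # hypothetical cutoff: envelope values run from 2^0 up to 2^(K+1)

def pair_weight(k):
    # weight of the pair (2^k, 2^(k+1)); halves with each doubling
    return Fraction(1, 2 ** k) if 0 <= k <= K else Fraction(0)

def p_other_worse(j):
    """P(the other envelope is smaller | you were shown the value 2^j)."""
    w_high = pair_weight(j - 1)  # case: you hold the larger of (2^(j-1), 2^j)
    w_low = pair_weight(j)       # case: you hold the smaller of (2^j, 2^(j+1))
    return w_high / (w_high + w_low)

print(p_other_worse(0))       # bottom value: 0 -- always switch up
print(p_other_worse(K // 2))  # middle values: 2/3
print(p_other_worse(K + 1))   # top value: 1 -- never switch
```

The uniform "2/3 everywhere" of arguments 1..5 survives only in the middle of the range; any integrable prior must break it at the edges.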

Tuesday, June 21, 2011

Wallet Wrongfully Returned

Losing your wallet at an economics conference:

You go to an economics conference.  The speaker takes wallets from two members of the audience and offers to combine the total and give it to whoever had the lesser sum.  You stand to gain the larger sum (if you had the smaller) or lose the smaller sum (if you had the larger).

You should expect a certain distribution of wallets.  You should be willing to play the game if:

sum of richer wallets > your wallet * number of poorer people.

The median person in the audience should be happy to play this game: if there are (R-1)/2 people richer and (R-1)/2 people poorer, then the first sum exceeds the second, since each richer wallet is richer than your wallet.  But of course, the audience as a whole doesn't get richer if we play... just the median person.

Can we design a group of players all of whom should play except the richest?  Yes, of course: we give the poorest person 0.  The next person has 0 too.  Suppose the first nonzero wallet appears at rank r, counting from least to greatest, and has value 1.  The next wallet can have value r, the next r(r+1), and so on.  A super-exponential distribution of wallets ensures that everyone but the richest person is willing to play.  But this is hardly a paradox... if the richest person is carrying an industrial nation's wealth in traveller's checks, this character might *strongly* resist playing this game with a bunch of people with relatively empty wallets.
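A quick check of that idea, using a slightly different but equally super-exponential growth rule of my own choosing (each wallet is the previous wallet times its index, plus one):

```python
def should_play(wallets, i):
    # wallets sorted poorest-first; player i should play if the richer
    # wallets sum to more than i's own wallet times the number of poorer players
    return sum(wallets[i + 1:]) > wallets[i] * i

# build a super-exponential batch: three empty wallets, then a wallet of 1,
# then each wallet is (previous wallet) * (its 0-based index) + 1
wallets = [0, 0, 0, 1]
while len(wallets) < 8:
    wallets.append(wallets[-1] * len(wallets) + 1)

print(wallets)
print([should_play(wallets, i) for i in range(len(wallets))])
```

This produces the wallets [0, 0, 0, 1, 5, 26, 157, 1100], and everyone should play except the holder of the 1100 wallet.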

Suppose there are R-1 other people in the audience, and your opponent is chosen randomly from among them.

You should begin by guessing a distribution of wallets.  Just guess some numbers and assign them to the audience.  Calculate for each person his or her rank r in the list of wallet-values w, ordering people from the richest to the poorest.  Let R be the maximum value of r, i.e., the number of people.  Let c be the cumulative sum of the wallets, so that to each rank r we assign c(r) = the wealth in the biggest r-1 wallets.  Let C be the sum of all wealth.  The person with rank r has R-r ways to lose w and r-1 ways to gain an expected c/(r-1) each.  The expected benefit to the player with rank r is: c/(R-1) - w * (R-r)/(R-1).  If w is normally distributed with respect to r, then the majority of players expect a small benefit, but the richest few expect a large loss.
Or if R = 10 and the wallet-sizes are 5,4,4,4,3,3,3,2,2,2, we can arbitrarily choose which 4 should lose to which 4, and get a linear order:

 [r]  1  2  3  4  5  6  7  8  9 10 -- rank.
 [w]  5  4  4  4  3  3  3  2  2  2 -- wealth or wallet.
 [c]  0  5  9 13 17 20 23 26 28 30 -- what you can win; the sum of the wallets from people who will have to pay you.

If I am the fourth-highest wallet-holder, then with chance 3/9 I will win an average of 13/3; i.e., my expected winnings are 13/9.  With chance 6/9 I will lose 4; i.e., my expected loss is w * (R-r)/(R-1) = 24/9, for a net expectation of 13/9 - 24/9 = -11/9.
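The per-rank formula can be tabulated for this example with exact fractions.  Since the game only transfers money around the audience, the expected benefits must sum to zero, which makes a good sanity check:

```python
from fractions import Fraction

wallets = [5, 4, 4, 4, 3, 3, 3, 2, 2, 2]  # ranked richest first, as in the text
R = len(wallets)

def expected_benefit(r):
    """Expected benefit of the player of rank r (1 = richest):
    c/(R-1) - w*(R-r)/(R-1), with c the sum of the r-1 richer wallets."""
    c = sum(wallets[:r - 1])
    w = wallets[r - 1]
    return Fraction(c, R - 1) - Fraction(w * (R - r), R - 1)

benefits = [expected_benefit(r) for r in range(1, R + 1)]
print(benefits)       # rank 4 comes out to -11/9
print(sum(benefits))  # a pure transfer, so this is 0
```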

Tim Harford in his book _Undercover Economist_ mentions the following twist: we now auction the prize w1+w2 to the two owners of the two wallets.  What is their winning strategy in an auction (with English-auction rules)?  Each player makes an estimate of the other player's wallet: e1 and e2 are the estimates of w1 and w2.  Now player 1 expects the value to be w1+e2 and player 2 expects the value to be e1+w2.  If player 1 bids w1+e2 and player 2 drops out, then player 1 pays w1+e2 to earn w1+w2.  He has paid e2 to win w2, which he perceives as a random variable with expected value e2.  The information available to him in the auction -- that player 2 continued bidding until w1+e2 -- can only improve his estimate of e2.  Thus, each player should be happy to obtain the other player's wallet at its expected value.  Perhaps they should bid a little higher in order to take into account each other's information.

Monday, June 20, 2011

The two-envelope problem

Suppose you go to a conference where the speaker invites you up on stage.  He offers you two envelopes and tells you the ratio between the sums in them.  He says "the ratio between their values is r," where r > 1.  He gives you one envelope.  Then he says "want to switch?"  You might switch or not, judging what you think the speaker's motivation is, and what you think the speaker thinks of your motivation (reasoning about his reasoning is indeed something a Bayesian actor should resolve -- think of the Princess Bride poisoned-cup scene).  After you've satisfied (or confused) yourself about which envelope might be better, do you still want to keep switching forever?

The paradoxical expected value:

An argument from expected value might indicate that you should switch: if your envelope is worth x coins, you stand to gain x * (r-1) or lose x * (1-(1/r)).  For example, if your envelope is worth 10 coins and the ratio between the envelopes is 2:1, then the other envelope is worth 20 coins or 5 coins.  By switching, you would gain 10 or lose 5.  If the odds of gaining and losing are just about equal (and, by symmetry, the odds that you have the less-valuable envelope might well be 1:1), then you benefit from switching.

But symmetry implies indifference.

In the two-envelope problem, this indifferent strategy seems optimal: switch if you feel like it (e.g., if it's nice to chat with the speaker), switch back again if you want (e.g., if you like to hear the audience laugh), and stop switching when it ceases to amuse you (the audience gets restless).  The paradox is that you might believe both that switching has no value (since the first envelope came to you randomly, and it could just as easily have been the other one) and that it has value (it would seem that, by symmetry, the odds of increasing or decreasing your wealth are 1:1; since the reward of increasing is greater than the cost of decreasing, the expected value is positive).

Prejudice helps; opening the envelope helps:

You can estimate the average unopened envelope's value from your prejudices about generosity, games, professors, conferences and money.  Look at the speaker's shoes... Check that it really is currency, and not a check, and figure out how much currency fits in an envelope, not a suitcase.  Think about whether it would have unduly inconvenienced the speaker to find deflated currency from a country which recently saw its currency lose value.  Does the speaker jealously watch the envelopes?  Does the speaker's briefcase have a handcuff on it, like a diamond-trafficker's?  Now just guess.  I guess ten coins, because at this moment I suppose the speaker would want to offer the minimal value which doesn't appear cheap.  Now open the envelope.  If I open it and see a bill worth 100 coins, I have to increase my estimate of the value of the other envelope up from 10 coins, but not up to 100 coins or more.  My new estimate is some average of the old estimate and the observed value -- 100 coins.  In any case, the new expected value is between the old expected value and the observed value, so switching away from a surprisingly good envelope has negative expected value (-EV).  On the other hand, if I open it and see a bill worth 5 coins, I expect the other envelope to contain something between 5 and 20 coins, so the same argument says I should switch away from a surprisingly bad envelope.

Prejudice may be enough:

If your envelope contains x, switching up gains x * (r-1) and switching down loses x * (1-(1/r)).  The expected value of always switching is the integral, over all values of x, of these gains and losses weighted by their probabilities.  We might hope that this integral comes to zero, even for unusual distributions of x.
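We can check that hope by simulation under an assumed integrable prior (an exponential distribution on the smaller sum -- my choice, purely for illustration).  With r = 2, the average value of always switching comes out near zero:

```python
import random

random.seed(1)
r = 2              # ratio between the two envelopes
trials = 200_000
total_gain = 0.0

for _ in range(trials):
    small = random.expovariate(1.0)   # assumed prior on the smaller sum
    envelopes = (small, r * small)
    mine = random.randrange(2)        # you get either envelope equally often
    total_gain += envelopes[1 - mine] - envelopes[mine]

print(total_gain / trials)  # close to 0
```

By symmetry the exact expectation is zero for any integrable prior; the simulation just illustrates that no paradox survives once the prior is pinned down.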

Two-valued coins:

Suppose we live in a country which denominates its coins as 5, 10, or 50.  The coins have one number written on the back and one on the front.  The coins are always printed 5-10 or 10-50 -- i.e., with 5 on one side and 10 on the other, or with 50 on one side and 10 on the other.  Suppose that the coins grow on trees, inside of flat nuts.  The shell (or husk or shuck) of the nut obscures the values written on the coins.  The value of a random coin depends first on which type of coin it is, and second, on which face is showing.  This country has a tradition of gathering nuts around old trees and laying them in long rows in wasteland.  The coins are not buried.  Cultivators simply lay them in rows, where they bloom into lines of trees.  Each cultivator usually owns a row, and they compare one row to another to see which strategies generate the most value.  The value of the coin, and then of the tree, is measured by the number of fruits produced.  A coin laid with the value x facing down will produce x fruits.  Agriculturists have not been able to find out how to generate better-quality nuts.  The two types of nut are generated with equal probability and with either face up.  Farmers enjoy eating the fruit of these trees, so any method to routinely increase the value of the nuts, the trees, or the rows of trees would add to their happiness.

Some children can sense the value of the nut through the husk (or shell) with uncanny accuracy.  These children are profitably employed -- they go down a row of coins which have not yet germinated and flip them so as to leave the better value facing down, into the dirt.  For some reason, these children never pick up a coin, examine both sides, and leave it in its preferred orientation -- they only look at the upward-facing value, and for this reason three things can happen:

They see 5, and they know it is a 5-10 coin in its preferred orientation.
They see 50, and they know it is a 10-50 coin that should be flipped.
They see 10, and they don't know whether it's a 5-10 coin or a 10-50 coin.

A winning strategy:

The professional switchers have settled on this strategy: when seeing 5 or 10, ignore it; when seeing 50, flip it.  Proof: if a 5 is showing, then the value 10 is in the dirt, and this is the best-possible orientation for that coin.  If the visible value is 50, then the value 10 is in the dirt; flipping the coin creates 40 fruits' worth of value.  When we see the value 10, the number facing down into the dirt is equally likely to be a 50 or a 5, and the 40-fruit benefit of leaving a 50 facing down outweighs the 5-fruit cost of leaving a 5 facing down.  So it is best to leave the 10s alone.  The professional switcher simply looks for surprisingly good value going to waste and corrects only that sort of error.  A similar strategy applies if the accuracy of the nut-reading is imperfect, for any information is better than none.

Before opening the envelope:

We have a row of nuts laid out, and I can't read the values at all since I wasn't raised as a nut-reading savant.  Would we do better to flip all of our nuts?  We notice that the children flip 1 nut for every 3 they leave in place, so perhaps we should flip none of our nuts.  On the other hand, if we simply toss the nuts down in a random way, it seems clear that we can't expect to improve on the productivity of the row of trees by flipping every nut. 

It seems to me that this is the paradox: With information, you can improve the expected value of the line of trees.  Without information, you can't.  Indeed, the expected value of flipping a nut is:

    the value of flipping a 5, times the proportion of 5's,
+    the value of flipping a 10, times the proportion of 10's,
+    the value of flipping a top-value V, times the proportion of top-values.

If we suppose that coins A-T and T-V (generalizing the 5-10 and 10-50 coins, with A=5, T=10, V=50) are equally likely, then the proportions of visible 5's, 10's, and 50's are 25% - 50% - 25%, and that calculation yields:

    -(T-A) times 25%
+    ((T-A)/2 - (V-T)/2) times 50%
+    (V-T) times 25%

The first term is the cost of flipping the A=5's into the earth; we plant A=5 instead of T=10 and lose T-A fruits.  This happens to 1/4 of the coins in the tree-line -- those which happen to be showing A=5.  The second term is the net effect of flipping the T's: we plant T, and so we are sure to gain T fruits, but we give up a buried V=50 or A=5 with equal probability.  The third term is the benefit from flipping the V's.  The children are always glad to see a top-value coin and flip it into the ground; the benefit of doing this is V-T fruits.  About 25% of the nuts in the tree-line are showing the top value V=50 before being flipped.

Now, when we add those terms together, -5/4 - 40/4 + 5/4 + 40/4 = 0, and you can see from the variables that the cancellation is algebraic: the costs and benefits of flipping 10s balance the costs of flipping 5s into the dirt and the benefits of flipping 50s into the dirt. 
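The cancellation can also be checked by brute-force enumeration of the four equally likely states (coin type times face showing), with the values 5, 10, 50 from the story:

```python
from fractions import Fraction

A, T, V = 5, 10, 50
# equally likely states, written as (face up, face down)
states = [(A, T), (T, A), (T, V), (V, T)]

def expected_fruits(policy):
    """Average yield per coin; a planted coin bears as many fruits as its
    face-down value.  policy(up) -> True means flip before it germinates."""
    total = sum((up if policy(up) else down) for up, down in states)
    return Fraction(total, len(states))

leave_all = expected_fruits(lambda up: False)
flip_all = expected_fruits(lambda up: True)
children = expected_fruits(lambda up: up == V)  # flip only a showing top value

print(leave_all, flip_all, children)  # 75/4 75/4 115/4
```

Flipping everything and flipping nothing tie exactly at 75/4 fruits per coin, while the children's strategy gains 10 fruits per coin on average.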

Thursday, June 16, 2011

Gambling with the Rent

The working poor take a gamble on reduced rent: People who rent their homes and regularly lose at gambling seek smaller homes, allocating a significant portion p% of their losses to reduced demand for rental housing space.  People who rent their homes and suddenly win at gambling cannot seek to rent a larger home, since the winnings are not constant.  Therefore, they allocate much less than p% of their winnings to an increased demand for rental housing space.  Therefore, the gamblers as a whole demand less rental housing space and shift that demand to something else -- presumably consumption and investments.  Furthermore, many gamblers who win big will buy their homes or pay their mortgages or in other ways stop being a person who works, earns stable income, and pays a significant portion of that income in housing rent. 

Rental housing prices fluctuate more than most goods available to working renters.  Suppose working renters as a class allocate 10% of their income to gambling in one country, whereas in another country they do not gamble and pay r% of their income in rent.  The working gamblers earn W when they don't gamble; their landlords earn r% of W.  Where gambling occurs, the landlords earn r% of 90% of W, which equals r% of W minus p% of 10% of W.  This implies that p = r, which is consistent with the fact that r balances your marginal preference for a slightly larger apartment against your preference for other expenditures, whereas p balances your marginal preference for a smaller apartment against your preference to cut back on other expenses.  Working renters who demand 10% less beer will get 10% less beer.  But working renters who demand 10% less apartment space may find that landlords are not willing to leave too many apartments vacant.  The inflexibly large supply of rental apartments will cause the price of renting to fall until the renters are willing to take all the existing apartments (minus those which the landlords are willing to leave vacant) at the price which the renters are willing to pay.  The renters may end up paying 10% less for 2% less space.  That ratio 8%/10% -- when working renters pay 10% less in total, they keep 8% of their space for free -- measures the "inelasticity of the housing market."  We'll write n% for the "inelastic generosity" of landlords; it might be 80% in the short term.

Landlords subsidize gambling:

Thus, paradoxically, the working poor could spend a portion of their income gambling, reduce all their expenses, including reducing the price they pay monthly for housing, and it's about 80% true that their landlords will let them keep their apartments and pay the same rent.

Do winners win, or only the lottery?

We suppose that the biggest impact of gambling is to transfer wealth from losing gamblers to winning gamblers.  The remainder -- salaries, casinos and other infrastructure, taxes, charitable donations and profits, or whatever governments and lotteries buy with their money --  should be relatively small.  The transfer of wealth between losers and winners decreases demand for those products which losing gamblers buy and increases demand for those products which gamblers buy after they win.  Losing gamblers consume less, but consumption markets can simply contract.  But so far we have not considered the economic impact of winners.  Working renters will apparently see their losses reduced by their landlords (who pay only r% times n% of the losses -- maybe 30% times 80% or just under a fourth -- but who can complain about receiving a fourth of their losses back?) regardless of how they lose their money.

Landlord generosity, an example.

Rental markets for working people are investments in fixed resources -- for this market to contract, housing prices would have to fall to the point where some renters would buy their apartments outright.  But falling prices for small apartments and falling rents are the same thing.  If the number of apartments does not contract (that is, if the landlords are all unwilling to leave any apartments empty in order to drive up prices), then n% is 100% -- the price of rent falls until the renters can again afford it.  Perhaps the landlords can change the renters' preference p% for apartments over other goods and services slightly, but probably not by the entire value of the renters' loss of income.  One could imagine gamblers a,b,c,...,z who rent houses A,B,C,...,Z, where the houses are ranked from the nicest to the worst.  If all the gamblers lose, they should all spend less on their housing: gambler z moves out into the street, gambler y moves to apartment Z, and so on, leaving apartment A free.  Or suppose one gambler has won: gambler g moves out of the apartments altogether (g might rent apartment A, which is now vacant, or might buy).  The fact that one apartment now stands empty forces rents down until gamblers a,b,c,d,e,f,h,...,z move back into apartments B,...,Z, at lower rent.  In fact, if rent is the only fixed resource which the renters are consuming, then all their other consumption decreases should be met by contractions in those markets, causing all those prices to stay fixed.  Only their rent will diminish, and their rents should be lowered by exactly r% of their gambling losses.

Economic rent:

It may seem unfair to the landlords that they pay for someone else's gambling habits, and pay at 80%!  Of course, if the landlords were collecting only the value of the stream of services provided by an apartment (those services including shelter, safety, rest, showers, space to cook and eat hot meals, etc.), then they would charge an inflexible price -- below this price, it is cheaper for them to leave the house vacant than to let you grind dirt into the nice floors, cause fire hazards with your candles, shoot up the place, burn the furniture, drive wheelbarrows through the hedges and operate a meth lab in the bathroom, or whatever else turns your fancy but runs down an apartment.  The price of an apartment would equal the expected costs of upkeep, plus some interest on the investments (the furniture in a furnished apartment, in addition to the cost of upkeep, costs the landlord the chance to invest elsewhere, so the landlord very reasonably expects a return on that investment).  But, in addition, there is some "economic rent" earned by those with deeds to municipal property, because land in a city is scarce and because cities are profitable.  The fact that the city has become more profitable is information that the landlord only learns when commercial and housing rental prices per square meter rise.  It seems, from what I read in economics, that the price of rent includes economic rent on very valuable municipal property.  Economic theorists have proposed that a city which increases its property tax and decreases its income tax would free the marketplace of some inefficiencies (caused by income tax, but not caused by property tax, basically because income tax operates "at the margin," where it kills all transactions which would be marginally efficient without it).

Differential rent:  Do landlords capture workers' expected profits?

Suppose you get a job offer in a faraway city.  You could go there, rent an apartment, and do the job.  Or you could stay somewhere where you are already welcome, rent-free, and find the best occupation available there.  Perhaps your parents are tolerant, or perhaps you have exceptionally supportive siblings.  Each choice has a certain expected profit, where profit is income minus expenses:

Take the job in the city: profits are income minus taxes, pension, transportation, food...
Stay home and write your book: profits are expected sales minus taxes and some contribution to your benefactor's budget.

If the landlords are the only ones in this story who own any fixed resource, then by a law of economics (that all profits go to someone who has a fixed resource) the profits accrue to the landlords.  The total rent extracted from each worker then should be that worker's expected profits in the city versus staying home, where profits are income minus whatever expenses are so dear to the worker that he would rather pay them than move out of his hovel.

Here we suppose that the economic rent on a worker's unique and personal abilities is one reason the worker earns profit (income above opportunity cost); the economic rent on land, another scarce resource, is the reason that landlords earn profit.  If the landlords control access to the city labor market, their profits should be differential rent -- the profit you get working a city job minus the profit you get by writing your book.  The landlord extracts this differential rent by renting squares of land and cubes of air at a fairly constant price but with varying quality, so that the squares available in cheaper apartments are of such terrible quality that no one who can afford a better place will dare to switch to a worse one.  Do they succeed in extracting differential rent?  If you never consider moving to a smaller apartment, but you do sometimes consider moving away from the city and writing your book, then perhaps they are succeeding in restricting your choices to the differential.

What do winners buy?

It would seem that renters' gambling causes a flow of money from slumlords to high-end landlords: everyone who loses loses a little, and all rents diminish.  The winners invest some money, which drives up the value of fixed investments, such as property.  The winners consume some high-end products.  The market for these products expands.  This increases the demand for services and materials, increasing wages and paying the owners of natural resources.  I have argued for the utility of gambling and then investing your money, so I hope that winners buy investments, as did this investing lottery winner.


If landlords effectively reimburse working renters for their collective gambling losses, then gambling should be observed to depress rental prices.  Studying the economics of gambling and residential property is complicated, however.  The most direct effect of legalizing gambling seems to be a rise in high-end property values, but this might be driven entirely by the development of large casinos.  Perhaps we need to look not at Las Vegas but at Oregon, or at the effect on the economy when home games of poker became popular. 

The article _Early impacts of limited stakes casino gambling_ by P.T. Long (1996) states: "Although not true in all of the gambling communities, residential property in Black Hawk and Central City has experienced a substantial decline..."  A newspaper report on Macau property describes the rise of high-end property values through the story of an individual company: "Instead of casinos, the partners launched an opportunity fund in 2006 to invest in high-end residential, retail and commercial property in Macau to capitalise on the boom in gambling...  Riding on the back of the gambling boom in Macau, the value of the five assets owned by the trust rose..."  The Financial Times, in its article _Asia-Pacific - Macao tries to cool housing market_ on 29 Sep 2010, reported: "While gaming revenue rose rapidly, however, so did house prices. Residential property prices in Macao have increased almost a third..."  An academic review of the effects of gambling reports higher property values without saying whether they are high-end or low-end, mentioning only business property: "higher property values lead to higher property taxes, which may make it more difficult for small business renters (though not necessarily for property owners)... Obviously, an increase in property values (and hence taxes) puts a squeeze on some operators, especially renters (property owning restauranteurs, even if they went out of business, reaped the gains of these property value increases), and again this is more a matter of distributional than aggregate impacts."  The obvious point -- that casinos bring jobs, and jobs raise rental prices -- is brought home by this commentary: "A new industry like casino gaming may have jobs and increased income associated with it. These amenities will induce an increase in property values." 

Are economic rents lower in home-rental prices in gambling towns?

Consistent with these predictions, I checked rental prices in ZIP codes with similar populations, similar population density, and similar distance to the center of a city of about half a million people: I found cities of similar size and general density, found comparable ZIP codes in those cities, and used a rent calculator to find prices.  The ratio of the average price for a rental house to the average home property value is lower in suburban Las Vegas than in Portland, which in turn is lower than in Denver.

Zip codes:                              89109 (NV)   97220 (OR)   80121 (CO)
Price per square meter, 1BR house:        0.96         1.07         1.10
Price per square meter, 2BR house:        0.86         0.90         1.02
Price per square meter, 3BR house:        0.76         0.90         0.99
Price per square meter, 4BR house:        0.71         0.78         0.81
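The claimed ordering can be checked mechanically.  A minimal Python sketch, treating the table's numbers as given data:

```python
# Rent ratios from the table above (price per square meter), by bedroom
# count, for ZIP codes 89109 (Las Vegas), 97220 (Portland), 80121 (Denver).
ratios = {
    "1BR": (0.96, 1.07, 1.10),
    "2BR": (0.86, 0.90, 1.02),
    "3BR": (0.76, 0.90, 0.99),
    "4BR": (0.71, 0.78, 0.81),
}

# The claim: Las Vegas < Portland < Denver in every row.
for size, (vegas, portland, denver) in ratios.items():
    assert vegas < portland < denver, size
print("ordering holds for every bedroom count")
```

The ordering holds in all four rows, which is at least consistent with the prediction, though of course four rows of data cannot rule out other explanations.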

I'm assuming that working renters in Las Vegas gamble more than working renters in Portland, who gamble more than working renters in Denver.  And while these cities have roughly the same population, there are many differences between Las Vegas, Portland, and Denver beyond the presumption that working renters gamble more in one place than another.

Gambling for your class.

Widespread gambling makes poor workers less profitable, but their rents should drop by a comparable amount.  The few who win large sums, invest them, and earn interest might stay in the city -- and compete for apartment space with those who earn high profits -- or might move home and collect dividends on their gambling wins without paying rent at all.  They no longer compete with the rest of their peers to rent apartments of the same quality as they have been renting.  Those who win small prizes can consume and enjoy them without worrying that those prizes will be confiscated as rent.  This is perhaps the best part of the scheme: while your losses do decrease your rent, your winnings do not increase your rent, because if you can't predict it, neither can your landlord.

Tuesday, June 14, 2011

Is utility unbounded?

 A thought-experiment.

If utility is unbounded, then an infinite amount of money, offered at any finite odds, is infinitely more desirable than any finite amount of money.  We would notice this effect in that infinite jackpots in lotteries would be surprisingly tempting.  Are you currently stealing from your friends in order to play high-stakes lotteries?  Would you gamble everything to play the lottery if the jackpot were infinite?  If you answer "no" and "yes," then maybe your utility curve is unbounded.  People do play lotteries more when the jackpot rises.  This suggests that people are able to manage their probabilities, make side bets, divide potential lottery winnings, and otherwise make real sense out of quantities of money far greater than they have ever earned or consumed.  Apparently, when faced with a lottery which pays hundreds of millions of coin at thousand-million-to-one odds, a 50% increase in the jackpot increases the players' interest.

What would happen if the odds remained fixed at thousand-million-to-one but the jackpot were infinite?  We might prepare for the shock to the global economy and government as the world receives its monarch.  Would you, in addition, try to win?  Would you give up every comfort in your life to win?  Would you sell, steal, borrow?  Would you attempt grand crimes to obtain vast sums with which to play?  Would you abuse the trust of your family and friends?  If not, then you have put a value c on your own comfort and the good you can do as a good citizen, and you have put a value v on economic omnipotence such that

  v < a thousand million times c. 

v is the utility bound -- the net sum of all the good you could do and all the fun you could have with unlimited resources.  On the other hand, if you are reading this and your conscience whispers "Yes, I would beg, borrow, and steal for a chance at v," then you have valued v > a thousand million times c.  That wouldn't prove that utility is unbounded.  To prove that utility is unbounded, we should give you worse odds, such as a trillion-trillion to one, and better comforts, say c' = all the good you can do and all the fun you can have when you step into the role of someone -- anyone -- who you think seems to have a lot of fun and/or do a lot of good.  If your heart knows that you would happily give up all that person's pleasures and good work in order to ruin his life with your gambling addiction, then you believe that

  v > a trillion trillion times c'. 

We could check that v exceeds any finite bound by setting the odds arbitrarily low.  A person who believes in unbounded utility is thus committed to behaving badly whenever he or she believes that an infinite lottery exists: such a person would happily suffer any finite setback, and cause any finite amount of damage, in exchange for a chance at winning v.  Perhaps your neighbors are such people; your neighbors act normally because they do not believe in an infinite lottery, and they are unwilling to behave badly in order to win the kinds of lotteries which they are offered.  If your neighbor is not stealing from you and buying lottery tickets with the stolen coins, it is because the neighbor judges that

the lottery jackpot * the odds of winning < the cost of stealing a coin. 


It's a strange value in cost-benefit analysis because it retains its value at any odds, and it outweighs any finite cost.  We avoid this by believing in no infinitely-valuable properties.  If a thought experiment asks us to accept that something (the prize in a lottery) might have a value infinitely greater than anything else in the universe, we can reject that notion with the Archimedean principle that everything is comparable to everything else -- all pains and pleasures, goods and evils are comparable to each other.  Or, if some things are infinitely better than others, we can ask whether there is a world of top goods which are all comparable -- perhaps my own good citizenship has far-reaching and "infinite" consequences.


Someone who says "utility is unbounded" can flatten the utility(coins) function to the point where, looking only at bets with realistic expected values, we cannot tell the difference between his bets and those of someone who believes utility to be bounded.  To a scatter-plot of observations you can fit functions which are bounded or unbounded.  Maybe we can find some data about willingness to play the lottery for very high prizes, and this would show the upper end of the utility curve.  The surprising result seems to be that people will play a lottery at low expected value if the prize is high, suggesting that they value the marginal coin more than a coin in the hand; that makes sense to me only in terms of investing that coin, not consuming it.  I wrote about consumption, investment, and gambling when discussing the utility of gambling.

Friday, June 10, 2011

The Utility of Gambling

People gamble.

Economists assume people use their money rationally.  If a simple model suggests that a common behavior is a losing strategy, economists seek an extended model which elucidates the behavior.  If the extended model makes predictions, these should be tested against data.  The usual argument is that gambling "violates stochastic dominance": if the expected payoff is 96% of the bet, then playing is irrational.  Gambling suggests that the marginal utility of money is increasing, whereas in many surveys the marginal utility of money is seen to be decreasing.  An economist might predict that when the marginal utility of money is increasing for a rational person, then that person will gamble.

Gambling suggests that ROI increases with wealth.

Warren Buffett makes a better ROI on his investments than I do.  If all of that advantage comes from his brains, then I can't copy him.  But if part of his advantage comes from being rich already, then I should gamble.

ROI model:

Some investments generate an income stream for the owner.  Stocks and bonds pay dividends.  Owning a house near where you work generates an income stream -- you don't have to rent a house, and if you take a long vacation, you can rent it out.  Suppose the investment opportunities I would face after winning a gamble are better than those which I face now.  As a simple model, suppose that an individual with wealth w earns an interest rate of (2 + log(w))% on her wealth (because she can use her spare time to learn about new business ventures, or because she can afford to diversify into risky ventures without risking her daily quality of life, or because large sums of wealth are slightly easier to manage, or because she can put some of it into long-term investments rather than keeping it all in a cash account).

If the Anderson family invests 1 coin and compounds it 100 times at that rate of interest, they will then have 27.71 coins.  The Bakers and the Cooks follow this strategy for 10 periods.  At that point they get "anxious" to reach their savings goal.  They take a gamble, to win 0.1 coin or lose 0.1 coin, with odds of 51:49 in favor of losing.  I.e., they accept odds slightly worse than 50:50, so that the casino could make a profit.  The Bakers lose and the Cooks win.  They then compound their money at the variable rate of interest for the remaining 90 periods, so that every family compounds 100 times in total.

That is... the Andersons apply the following iteration 100 times:

wealth = wealth * (1 + 0.01 * (2 + ln(wealth)))

The Bakers and the Cooks do likewise, but after compounding their initial 1 coin 10 times, they gamble 0.1 coin -- the Cooks add 0.1 coin to their wealth and the Bakers lose 0.1 coin -- and then compound 90 more times.  The results, after 100 compounding periods each, are:

A: 27.713150
B: 22.654267
C: 33.364858

The expected wealth of those who follow the Bakers-Cooks strategy is: (B * 0.51 + C * 0.49) since there are 51 losers to every 49 winners in the lottery.  That value exceeds the result realized by A.
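A minimal Python sketch of the experiment (here the Bakers and the Cooks compound 10 times, gamble, and then compound 90 more times, so that every family compounds 100 times in total -- the reading that reproduces the printed results):

```python
import math

def compound(wealth, periods):
    """Apply the variable-rate iteration: the rate is (2 + ln(wealth)) percent."""
    for _ in range(periods):
        wealth *= 1 + 0.01 * (2 + math.log(wealth))
    return wealth

# Andersons: compound 1 coin 100 times, no gamble.
A = compound(1.0, 100)

# Bakers and Cooks: compound 10 times, gamble 0.1 coin, compound 90 more.
w10 = compound(1.0, 10)
B = compound(w10 - 0.1, 90)   # Bakers lose the bet
C = compound(w10 + 0.1, 90)   # Cooks win the bet

expected = 0.51 * B + 0.49 * C   # 51:49 odds in favor of losing
print(A, B, C, expected)
```

Running this reproduces the three figures above, and the expected wealth of the gambling strategy does come out ahead of the Andersons' sure thing.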

Of course, the Cooks and Bakers can do much better if they would bet with each other.

Saving towards a goal.

In real life, people often gamble so as to "make up the difference" between their savings and a desired investment goal -- a house or a business, for example.  When you have a goal in mind, and when you expect that goal to increase your quality of life and generate an income stream with a better ROI than the investments you already own, is it rational to gamble your savings and take a chance on securing the new, desired investment either earlier or later than you would expect?

An example -- brothers buying houses.

Suppose my next purchase will be a house and that I want to buy it with cash so that I will have neither rent nor mortgage payments coming due monthly.  This investment thereby represents an income stream to me.  I can wait for the cash to accumulate and then buy, and then begin to enjoy the utility and income stream from the purchase.  Or, when I have some portion of the money and find a convenient opportunity to buy, I can gamble and stake my savings against the money needed for the investment.  If I win, then I get the income stream early, with its increased ROI.  If I lose, then I get it later.  The average of these two conditions is better than simply waiting.  Suppose that when I start work, my income stream is $50 per day (above expenses).  Suppose that at this rate it will take me 20 years to save the money to buy a house and that I earn no interest on my savings.  Suppose that, having got the house, my income stream will be $150 per day.  After working for 10 years, my brothers gamble.  51% of them lose everything -- their savings are now 0 and they continue to earn income at $50 per day.  On the other hand, 49% of them get their house and now earn income at $150 per day.  My brothers' average income stream is now greater than my own.  The gamble was similar to a pact -- they could have decided to pool their money and get houses, one by one, as they were able to do so.  Gambling and pacts are efficient if ROI is higher for greater sums of wealth.  The Teachers' Credit Union may be a pact allowing teachers to earn ROI available to their total wealth, rather than the ROI available to their individual wealth.
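The arithmetic of the brothers' average income stream is a one-liner, using only the numbers from the example above:

```python
# After the year-10 gamble: 51% of the brothers lose everything and keep
# earning $50/day; 49% get the house and earn $150/day.
loser_income, winner_income = 50, 150
avg = 0.51 * loser_income + 0.49 * winner_income
print(avg)   # 99.0 -- nearly double the $50/day of the brother who kept saving
```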

A friendly game of poker:

If you want to buy a house, then join my poker club.  When we have enough savings between us, we will play poker (or, if someone is too good at that game, something more random) until we all lose our small investments and one person has all the money and goes to buy the house.  We could have done some complicated thing where we give him the money and then force him to continue to contribute, but gambling makes this simpler.  If there are only 3 of us and the cost of a house is 6*x, then one of us will save 2*x and then get the house (and the income stream!); another will save 2*x, lose it, then save 3*x and get a house; the third will lose 2*x, lose 3*x, and then save 6*x and get the house.  Together, we saved the price of three houses, and some of us got the income stream early.  If ROI is equal for the rich and the poor, then it was all a game.  But if the ROI we get from owning a home is better than the ROI we earned on the savings while we saved, then it is clever to lose your savings so that another of us can close the deal on the good life.
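A sketch of the bookkeeping, under one reading of the schedule described above (the stakes and the order of winners are exactly those of the example):

```python
# Three-person poker club; a house costs 6*x.
x = 1.0
saved = [0.0, 0.0, 0.0]      # total amount each member saves over time

# Round 1: all three save 2x each; player 0 wins the 6x pot and buys a house.
for i in range(3):
    saved[i] += 2 * x
# Round 2: the remaining two save 3x each; player 1 wins the 6x pot and buys.
for i in (1, 2):
    saved[i] += 3 * x
# Round 3: the last member saves 6x alone and buys.
saved[2] += 6 * x

print(saved)        # [2.0, 5.0, 11.0] -- each member's lifetime saving
print(sum(saved))   # 18.0 == the price of three houses
```

The totals confirm the claim: together the club saves exactly the price of three houses, but two of the three houses are occupied earlier than they would have been.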


In a lottery, the payoff ratio can be very low.  Considering that the lottery winner pays tax and that the lottery payout may be only 50%, the effective payoff might be 25%.  That is still rational if the ROI on winnings is more than 4 times greater than the ROI the player was earning on the money which the player gambles.

The St Petersburg paradox; a resolution in favor of gambling

Utility companies buy and sell energy.

When money runs out, we can ask to be paid in joules of energy, or kilowatt-hours.  If we are paid more energy than we can use, we can use a few joules to build machines which could use the total sum.  Is there utility in smashing protons?  Utility in interstellar travel?  G8 countries consider these things to have positive utility.  Engineers and physicists can propose ways to use any amount of energy.  Imagine if the public could fund big science via a fixed-price lottery with infinite expected return on investment!  I would vote to buy that investment.  I conclude that infinite and finite versions of the St. Petersburg lottery will attract players who want to spend some part of their current utility on a chance to visit Alpha Centauri (e.g., via Project Orion).  Thus, like any lottery with giant payouts:

The St. Petersburg lottery will have players at any price which players can afford.

Combinatorics matters:

Consider two games: the Petersburg lottery "independent-style" (PI) and the Petersburg lottery "guaranteed" (PG).  Game PG ends when you win any prize, and you are guaranteed to eventually win some prize.  Game PI never ends; winning one prize does not stop the game.  Suppose one prize is revealed each year.  Game PI is, clearly, worth one coin per year to play, for each year the expected winnings are 1 coin.  Game PG, in which the prizes are disjoint, exclusive, and guaranteed, has the same expected value, much less risk, and is guaranteed to leave you feeling an infinite regret.

Infinite regret:

The independent lottery is manifestly fair: each year it pays out an expected one coin, equal to your one-coin investment.  The Bernoulli "guaranteed" lottery is different: each year you anticipate an even bigger reward.  But you know in advance that one year you will win.  After n years of paying 1 coin per year, you will then earn 2^n coin.  You could spend half of your earnings every year.  n years later, you would be bust.  You would then continue to pay forever, 1 coin per year, for the pleasure of having played, and won, the Bernoulli lottery.  This is why Bernoulli's lottery is a paradox: the independent lottery is fair.  Bernoulli's lottery has the same expected payoff, lower risk, and yet you are sure to regret having played it.  If we consider a payment scheme other than "one coin per prize," we find that if the sum of the payment scheme (over all time) is finite, the casino's expected loss to the player is infinite; if the payment scheme is infinite, the player's expected loss is infinite. 
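A sketch of the comparison, assuming the scheme described above: in year n the prize is 2^n with probability 2^-n, and the infinite horizon is truncated for computation:

```python
# Yearly expected payout of the "independent" game PI, where year n pays
# 2**n with probability 2**-n, versus the "guaranteed" game PG, where the
# same prizes are disjoint and exactly one of them eventually occurs.
N = 50  # truncate the infinite horizon

pi_yearly = [2**n * 2**-n for n in range(1, N + 1)]       # 1 coin every year
pg_total_prob = sum(2**-n for n in range(1, N + 1))       # -> 1 as N grows
pg_expected = sum(2**n * 2**-n for n in range(1, N + 1))  # N coins over N years

print(pi_yearly[:3])    # [1.0, 1.0, 1.0]
print(pg_total_prob)    # ~1.0: a win is (essentially) guaranteed
print(pg_expected / N)  # 1.0 coin per year, the same as PI
```

So the two games are indistinguishable by expected value alone; the difference lies entirely in the combinatorics -- whether the prizes exclude one another.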

On the other hand, if the player could invest 2^n coins and pay the infinite cost of the game from the interest earned, then things are different.  Of course, the value of a lottery depends on what other investments are available -- competing for the price of the lottery, and available for investing the winnings.

Bernoulli's paradox:

The St. Petersburg lottery is completed-infinite, and you are guaranteed to win one prize.  Reasoning rationally, the kind of person who is happy to play a lottery for tremendous payoffs in joules would much prefer to play PI than PG.  But PG is offered.  The player who will sign the infinite contract -- to pay 1 coin per year for the pleasure of having played -- will end up with n joules of investment and 2^n joules of largesse, followed by infinite regret.

Bernoulli succeeded in showing that there is another variable to be considered when comparing games... not only expected value, and not only risk, but something else, which can be coded into the combinatorics of the game.  By combinatorics, in this case, I mean the condition that you can't win more than one prize -- the condition that the prizes are disjoint.

Thursday, June 9, 2011

The St. Petersburg paradox

The St. Petersburg lottery:

... is a game of chance which costs c coins to play and pays back to the player exactly one of the following prizes: 1 coin with probability 1/2, 2 coins with probability 1/4, 4 coins with probability 1/8, 8 coins with probability 1/16, etc.  The sequence of prizes 1, 2, 4, 8, 16, 32, ... lists the powers of two.  The sequence of probabilities 1/2, 1/4, 1/8, 1/16, ... is chosen so that the sum of all probabilities is 1.  I discuss the importance of this condition -- that the prizes are exclusive and one prize is guaranteed -- extensively in my next post.  Without this condition, I believe that the paradox can be resolved.

The mechanism of the lottery could be as follows: a fair coin is flipped.  If it lands on one side, the pot is doubled.  If it lands on the other side, the pot is paid to the player.  The St. Petersburg paradox is that there seems to be no good answer to the questions: What cost c will attract players to the lottery and what costs c will attract a casino to offer this game?
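The mechanism is easy to simulate.  A sketch, assuming the pot starts at 1 coin (the seed is arbitrary, chosen only to make the run reproducible):

```python
import random

def petersburg_round(rng):
    """One round: double the pot until the coin lands 'pay'; return the payout."""
    pot = 1
    while rng.random() < 0.5:   # the 'double' side of the fair coin
        pot *= 2
    return pot                   # pays 2**k with probability 2**-(k+1)

rng = random.Random(0)           # fixed seed for reproducibility
payouts = [petersburg_round(rng) for _ in range(100_000)]
print(sum(payouts) / len(payouts))  # sample mean: unstable, dominated by rare huge pots
```

The sample mean never settles down as you add more rounds, which is the paradox in empirical form: the theoretical expectation is infinite, but any finite run of the game looks modest.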

The paradox:

Informally, the paradox of the St. Petersburg lottery is that no one wants to play it -- neither as the casino nor as the player.  Formally, the paradox is that the expected payoff to the player is infinite, so the player should be willing to pay an infinite sum, in order to be able to play the game.  On the other hand, the player will surely receive only a finite payoff on each game; a player who paid an infinite sum cannot possibly win it back or ever profit from the game.

The game violates the notions that you can put a price on anything and that a game is favorable to one player or to the other -- this game scares off both players.  For most games, we can vary the cost and find that for high values of c a casino would happily offer the game and win consistently; that for low values of c the player would have the advantage and win consistently; and that for many values in between, the casino is happy to offer the game and win consistently while some players are happy to play.

The "winnings - odds lemma" :

Consider a simple gamble in which a player stands to lose a or win b with odds p:q.  The "lemma" is that the game will find players if q*b > p*a.

For a hypothetical player H whose experience matches the odds -- q wins and p losses -- the winnings are q * b and the losses are p * a.  When p and q are small natural numbers, a real-life player may find that his experience amounts to a repetition of H's experience plus a random variation, whose monetary value grows more slowly (roughly as the square root of the number of plays) than the value of repeating H's experience.  When p and q are small, the game will have players when q*b > p*a.
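A sketch of the lemma with illustrative numbers (the stakes a = 1, b = 3 and odds p:q = 2:1 are my own example, not from the text; the seed is fixed for reproducibility):

```python
import random

def simulate(p, q, a, b, plays, rng):
    """Average profit per play: lose `a` with odds p, win `b` with odds q."""
    p_win = q / (p + q)
    total = 0.0
    for _ in range(plays):
        total += b if rng.random() < p_win else -a
    return total / plays

# Lose 1 or win 3 at odds 2:1 against -- q*b = 3 > p*a = 2,
# so the lemma says the game will find players.
p, q, a, b = 2, 1, 1.0, 3.0
expected = (q * b - p * a) / (p + q)    # +1/3 of a coin per play
observed = simulate(p, q, a, b, 100_000, random.Random(0))
print(expected, observed)
```

Over many plays the observed average hugs the expected +1/3 per play, with a deviation that shrinks like one over the square root of the number of plays, as the paragraph above describes.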

Utility and lotteries:

What happens as the odds against winning become extremely large?  Present-day lotteries find players for games in which q*b = 0.5 * p*a.  For instance, the lottery may offer million-to-one odds of winning half a million.  We could call this the Lottery Paradox.  It is a cliché that such a lottery imposes a "tax on bad math skills."  But economists insist on believing that people in society are in fact playing winning strategies, and that when a strategy seems senseless, we should first attempt to find an explanation for the behavior before giving up the assumption that people act rationally.  To play million-to-one odds of winning half a million could be rational if money had increasing marginal utility, or if people viewed small possibilities in a "hopeful" way.  On the other hand, the correspondence of the early mathematicians suggests that they expected money to have decreasing marginal utility, and that people would view small possibilities with distrust.

The "expected value" lemma :

A player should play a complicated game if his expected winnings exceed the expected losses.  A special case of this lemma is the winnings - odds lemma stated above. 

Correspondence (my selection and paraphrases from the source cited above):

Montmort (Paris): [Despite the paradox] I cannot resolve to abandon our lemma, which must generally be true.  (The lemma being that when expected winnings exceed expected losses, the game is favorable to the player.)

Cramer: Utility is finite.  Wealth above 2^24, or 16,777,216 coins, is all the same to a player; this alone shows the game is worth no more than about 13 coins.  And if the utility of wealth varies as the square root of that wealth, then the game is worth no more than 3 coins.

N Bernoulli: Perhaps gamblers ignore, and should ignore, very small probabilities.  If players all suppose that odds such as 31:1 are as good as 32:0, then they would play the game for 2 and a half coin, and no more.

Daniel Bernoulli: The "expected value" lemma ignores Risk.  The masters of the Bernoulli clan lost all to the bankruptcy of Mr. Muller.  Chasing the best expected return would not have prevented this.  I have written a monograph on Risk and I am sending it to you.  This monograph explains how to avoid disaster when investing.

Nicolas Bernoulli to Daniel Bernoulli: Thanks for sending me your manuscript.  Cramer wrote to say that a millionaire gains little from an additional thousand.  You point out that a family accustomed to millions might lightheartedly win or lose thousands, but should be less light-hearted about losing "all."  Yes, the expected value of 1000 coins, invested with 9:1 odds against bankruptcy, is 900 coins; the expected value of those coins, invested in two businesses, each with 9:1 odds against bankruptcy, is the same.  It is very true, and we know it without paying attention to your principle, that one does nonetheless better to place 500 coins in 2 places than 1000 coins in a single place, since the chance of losing all is then 1 in 100, rather than 1 in 10.  But you would not have done better if you had been in charge of investing our money, seeking to earn interest on it, and without the chance to divide it into small pieces.

The limits of real money:

A player could play with money borrowed from many investors.  Hundreds of millions of investors can share ten trillion (ten million million) coins as dividends of a hundred thousand coins each.  Such dividends are low enough that individuals can easily comprehend how to spend them and get utility.  Windfalls like this have occurred in my lifetime to the citizens of small nations with stable democracies when tremendous natural resources were discovered there.  Even the trickle-down effect of a successful banking enterprise in a small country can be worth far more to its citizens than a single windfall of a hundred thousand coins.  In this way, each additional coin is enjoyed by one of the investors in such a way that the coin can have high utility -- it is no one person's ten-trillionth coin.  Of course, Cramer could have imagined this.  In 1713 the population of the globe exceeded half a billion people.  Personal investment and massive transfers of wealth between nations occurred (less than 100 years later, the Louisiana Purchase was valued at tens of millions of dollars or francs).  Utility does not so much diminish as become complicated, involving more people, so long as the winnings and losses in question do not exceed the net aspirations and wealth of the poor of the earth.  We can imagine sums of money comparable to the world's net wealth.  Europe could trade wealth with China -- we can imagine transfers of this magnitude.  Should China play odds of a trillion to one to win 1.1 trillion from the USA?  If the payouts were capped at a trillion rather than at Cramer's 2^24, we would raise the value of the truncated game only modestly -- from about 12 to about 20 coins -- where the truncated game offers all the prizes from 1 to 2^n, with the same probabilities, but none of the larger prizes.
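One way to reproduce the 12-to-20 figures.  This sketch assumes the truncated game pays the prize 2^k with probability 1/2^(k+1) for k = 0, ..., n and pays nothing on the residual probability mass, so each prize level contributes exactly half a coin to the expectation:

```python
# Expected value of the St. Petersburg game truncated at the prize 2**n:
# prizes 2**0, 2**1, ..., 2**n with probabilities 1/2, 1/4, ..., 1/2**(n+1).
def truncated_value(n):
    return sum(2**k * 2**-(k + 1) for k in range(n + 1))   # == (n + 1) / 2

cramer_cap = truncated_value(24)    # cap at 2**24 ~ 16.8 million coins
trillion_cap = truncated_value(40)  # cap at 2**40 ~ 1.1 trillion coins
print(cramer_cap, trillion_cap)     # 12.5 and 20.5 -- roughly the 12 and 20 above
```

Because each level is worth half a coin, doubling the cap 16 more times (from 2^24 to 2^40) adds only 8 coins of value: the truncated value grows logarithmically in the cap.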

Simple winnings and odds, for nations : 

Consider this simple gamble in which the player stands to lose 1 coin or win 1.1 trillion coin, with odds of a trillion to one.  As this is a finite game, our prejudices become theorems: whatever the relationship between utility and money, and however we hopefully overestimate or underestimate probability, there is a certain price at which this game is fair.  Below that price, it is to the player's advantage, and above that price it is to the casino's advantage.  As ambassador, I probably would accept advantageous odds to spend pocket change and possibly win part of the global GDP for your nation. 

Dream big.

With the right bet, you have a chance to satisfy a child's rhetorical request for a Christmas present such as "let everyone eat enough food this year" -- if you won the right to manage this year's global GDP, you could scrap every piece of military hardware produced this year and lay the pieces as the foundation for an awesome scuba-diving playland.  That's probably not your child's dream, but if you gambled and won "the rights to everything produced this year in the world," you could make some very strange dreams come true.  We all hope you tread carefully on the economy during your year of absolute control!

Tell me the way to the St. Petersburg Casino!

I would definitely pay 40 coins to play, at the same time, each of the lotteries "2^n : 1 odds of winning 2^n" for n from 1 to 40.  I would also pay 40 coins, the fair market price, to play the finite St. Petersburg lottery with 40 coins.  I would like to buy many of these and sell the separate lotteries to different users.  The small lotteries I would roll back into the game.  The million-coin lotteries, on the scale of massive government-protected jackpots, I would re-sell at a profit.  The larger lotteries make wonderful Christmas gifts for people with big dreams; I'd attach a few coins to some of my own dreams.