Thursday, July 7, 2011

Second Eggs

I read Tim Harford's book Undercover Economist; the second edition contains an argument about mortgage-backed securities and further derivatives on them.

The author must have included this on the notion that some financial instruments involved in the world's recent financial crisis are similar to "second eggs."  He mentions that mortgage-backed securities were derivatives built on mortgages being generally safe, but that further derivatives were taken out on top of those.  And so he suggests we consider a toy model in which you are offered an investment which we'll call an egg.  The investment will probably deliver value; on the other hand, it might become worthless.  We estimate that eggs go bad with probability p.  From eggs, we get an investment called a "second egg," which names a "carton" of 6 eggs and pays off if the number of bad eggs turns out to be 0 or 1.  From six second eggs, we can of course get a carton of "second second eggs," and so on.

This emphasizes that while p represents risk, estimating p wrongly is also a risk.  And yet, a basic notion in probability theory is that we have to guess something and then reason to a conclusion.  We have to guess p, and then see what the consequences are.  Or we can guess a distribution for p and see what the consequences are.  But what are the consequences of guessing the wrong distribution?  Probability theory should hope that the recursion stops here... that a distribution of distributions for p is not any different from a distribution for p.  And we could hope that there are only two sorts of error:

-- the probability p that an investment goes sour;

-- the error in estimating p; call it p'.   

He points out that if eggs go bad with probability p, then "the second egg to go bad" is an investment which goes sour with probability q.  He then points out two interesting things: if p is below 5%, then q < p -- i.e., the second bad egg in a batch of six is a safer bet than buying a single egg.  On the other hand, if p is 10% or more, then q > p.  Supposing that we have no extra information about second eggs other than that they are ... second eggs ... then to estimate q we must minimize p'.  If we can't guarantee that p < 5%, then we can't guarantee that q < p.  I made this table just to check.  In the left column, the probability that one egg is bad is 5%; in the right column it is 10%.  Then we calculate the probability that at least 2 eggs in a batch of 6 are bad.

Example from Tim Harford's book Undercover Economist, 2nd edition.

Take a batch of six eggs, and rank them with the worst ones first (perhaps the ranking is revealed later).

                                                  p = 0.05       p = 0.1
  probability that one egg is bad:                0.05           0.1
  distribution of the number of bad eggs:
    0                                             0.735091891    0.531441
    1                                             0.232134281    0.354294
    2                                             0.030543984    0.098415
    3                                             0.002143438    0.01458
    4                                             8.46094E-05    0.001215
    5                                             1.78125E-06    0.000054
  probability that the first egg is bad:          0.264908109    0.468559
  probability that the second egg is bad:         0.032773828    0.114265
  probability that the third egg is bad:          0.002229844    0.01585
  probability that the fourth egg is bad:         8.64063E-05    0.00127
  probability that the fifth egg is bad:          1.79688E-06    5.5E-05
  probability that the sixth egg is bad:          1.5625E-08     1E-06

(The author mistakenly rounds 0.26490810 to 0.27.)

Take a batch of six SECOND eggs, and rank them with the worst ones first (perhaps the ranking is revealed later).

  probability that one second egg is bad:         0.032773828    0.114265
  distribution of the number of bad second eggs:
    0                                             0.818781906    0.482862
    1                                             0.166463346    0.373752
    2                                             0.014101255    0.12054
    3                                             0.000637082    0.020734
    4                                             1.61903E-05    0.002006
    5                                             2.1944E-07     0.000104
  probability that the first second egg is bad:   0.181218094    0.517138
  probability that the second second egg is bad:  0.014754748    0.143386
  probability that the third second egg is bad:   0.000653493    0.022846
  probability that the fourth second egg is bad:  1.6411E-05     0.002112
  probability that the fifth second egg is bad:   2.20679E-07    0.000106
  probability that the sixth second egg is bad:   1.23926E-09    2.23E-06

Take a batch of six SECOND SECOND eggs, and rank them with the worst ones first.

  probability that one second second egg is bad:  0.014754748    0.143386
  distribution of the number of bad second second eggs:
    0                                             0.914673513    0.395103
    1                                             0.082187318    0.396811
    2                                             0.003077034    0.166053
    3                                             6.1441E-05     0.03706
    4                                             6.90092E-07    0.004653
    5                                             4.13385E-09    0.000312
  probability that the first second second egg is bad:   0.085326487    0.604897
  probability that the second second second egg is bad:  0.003139169    0.208086
  probability that the third second second egg is bad:   6.21353E-05    0.042033
  probability that the fourth second second egg is bad:  6.94236E-07    0.004973
  probability that the fifth second second egg is bad:   4.14417E-09    0.00032
  probability that the sixth second second egg is bad:   1.03176E-11    8.69E-06
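
The table is easy to recompute.  Here is a short Python sketch (my code, not the book's); each level's sour-probability is a binomial tail of the level below it:

    from math import comb

    def tail(p, k=2, n=6):
        # Probability that at least k members of a carton of n go bad,
        # when each goes bad independently with probability p.
        return sum(comb(n, j) * p**j * (1 - p)**(n - j) for j in range(k, n + 1))

    for p in (0.05, 0.10):
        q = p
        for level in ("egg", "second egg", "second second egg"):
            print(f"probability that one {level} is bad: {q:.9f}")
            q = tail(q)  # a carton sours when at least two of its members sour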

Friday, June 24, 2011

Two-Envelope Paradox II

This post continues a discussion of the Two-Envelope Paradox which has been going on at 
Res Cogitans blog.

Suppose you enter a game show where two envelopes are on display.  You are told that the envelopes contain gameshow points -- points are translated into currency at a constant (linear) rate c at the end of the show -- and, say, that the two sums differ by a factor of two.  You are initially asked whether you prefer one envelope to the other.  They are blank and identical, so you do not (call this indifference claim 0).  You are then shown the value in one envelope -- 2000 gameshow points.  At this point, you might reason:

1. as you don't know c, there is still no point in swapping.
2. if there is no point in swapping, then the expectation value of swapping is 2000.
3. if the expectation value is 2000, then 2000 = p*1000 + (1-p)*4000, where p is the probability that the other envelope holds the smaller amount.
4. solving, p = 2/3: the probability of swapping and getting 1000 is 2/3.
5. this calculation holds whatever value was in the first chosen envelope.

  i.e., p(the other envelope is worse | your first envelope's value is X) = 2/3 for every condition assigning a value X to the first envelope.

6. when given the initial choice of envelopes you have a 2/3 probability of choosing the higher amount.

The initial indifference (claim 0) and claim 6 clearly contradict.  Where did the reasoning go wrong?

I believe it went wrong here: 6 does not follow from 1..5.

From p(y|x) for various x we want to infer p(y).  Let y be the event "the other envelope is better," and let x be the condition "the value in the first envelope is X."  If we had an integrable prior distribution of envelope values, we could reason using the "law of total probability" that p(y) = ∫ p(y|x) f(X) dX, where f is the prior density of X.  When the distribution of X is not integrable, can we reason according to a law such as the following?

Law?: if p(y|x) is constant as x ranges over all possible conditions, then the total probability p(y) takes that same value.

When X is not integrable, we can gather the conditions x into pairs and find that p(y | paired condition) is twice as large; then p(y) would take on that larger value by the same law.  That law is not valid.

Arguments 1..5 tease out the consequences of "indifference."  I don't think 1..5 are logically necessary at all, but they make a compelling argument about the implications of indifference: what to do if you don't know what gameshow points are worth to you, and you only know that more points are worth more to you.
I think that once you assume that utility is linear in gameshow points and that p(y|x) is constant, the chain of reasoning 1..5 is correct, leading inexorably to conclusion 5.  But one could go on reasoning after 5, reaching problematic conclusions about both small and large amounts of money.  I will try to do that now.

7. Arbitrarily small values exist in the game.  Whatever values we see, we should have expected smaller ones.

What is the domain of the gameshow points?  Is there a minimal value?  1..5 constrain the distribution of envelopes so that for every x, a value of x/2 is twice as likely as 2x.  So there is no minimal amount of money in this game, and the probability of tiny amounts outweighs the probability of large amounts.  If the first envelope contains 0.01 gameshow points, then I might still switch down without losing expected value, because the smaller values are in fact more likely.  There is no "gameshow cent."  1 is rarer than 0.1, which is rarer than 0.01, which is rarer than 0.001, etc., so the treatment of small values is not going to come up only rarely.  Perhaps part of the paradox is being pushed into the gap between 0 and arbitrarily small values.  As a result, gameshow points cannot translate into real money.  We could suppose that there is a constant c such that gameshow points times c = currency in your hometown.  But currency in your hometown doesn't exist in arbitrarily small amounts.  So there has to be a minimal gameshow value.  But by arguments 1..5 there is none.

8. The game show is obtaining its values in some way other than choosing them from a distribution. 

In order to compute p(y) from p(y|x) using the law of total probability, and in order to assign probabilities at all, we should form an integrable prior distribution.  This will simply violate 1: the notion that you don't know what gameshow points are worth, so you treat all values the same.  Of course, you can resist forming a prior distribution and still reason... but then an "implied distribution" is still present behind your actions.  The implied distribution seems to be x^(-0.5), since for that distribution the chance of switching down is twice the chance of switching up.  This distribution is not integrable.  Not only does it put tremendous weight on small values, it effectively puts nonzero weight at very large values (or at infinity).  If we cut this distribution off at any arbitrary top and bottom value, we find a very interesting fact: you *should* switch up from small values, but at top values you should not.  How is the game show supposed to generate its random envelope values?  It must have, in fact, an integrable distribution.  More likely, it has a bucket full of envelopes -- a discrete distribution.  The contestant may well not know it, but some such distribution must exist in order to create the game show.
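
To illustrate that cut-off fact, here is a Monte Carlo sketch (my own construction, not anything from the original discussion): envelope pairs (s, 2s), where s is a power of 2 between 1 and 1024, weighted like 1/s so that in the middle of the range the chance of switching down is twice the chance of switching up.  Switching gains on the smallest value, breaks even in the middle, and loses badly at the top:

    import random

    levels = [2**k for k in range(11)]   # the smaller envelope: 1, 2, ..., 1024
    weights = [1.0 / s for s in levels]  # middle values: switching down is
                                         # twice as likely as switching up

    def draw():
        s = random.choices(levels, weights)[0]
        pair = (s, 2 * s)
        seen = random.choice(pair)       # you are handed one envelope at random
        other = pair[0] if seen == pair[1] else pair[1]
        return seen, other

    gains = {}
    for _ in range(500_000):
        seen, other = draw()
        total, n = gains.get(seen, (0.0, 0))
        gains[seen] = (total + other - seen, n + 1)

    for seen in sorted(gains):
        total, n = gains[seen]
        print(f"seen {seen:5d}: mean gain from switching {total / n:+10.2f}")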

Tuesday, June 21, 2011

Wallet Wrongfully Returned

Losing your wallet at an economics conference:

You go to an economics conference.  The speaker takes wallets from two members of the audience and offers to combine the total and give it to whoever had the lesser sum.  You stand to gain the other person's larger sum (if you had the smaller wallet) or lose your own sum (if you had the larger).

You should expect a certain distribution of wallets.  You should be willing to play the game if:

sum of richer wallets > your wallet * number of poorer people.

The median person in the audience should be happy to play this game -- if there are (R-1)/2 people richer and (R-1)/2 people poorer, then the first sum exceeds the second, since each richer wallet holds more than your wallet.  But of course, the audience as a whole doesn't get richer if we play... just the median person does, in expectation.

Can we design a group of players for whom all of them should play except the top one?  Yes, of course: we give the poorest person 0.  The next person has 0 too.  If we ever rise above 0, then that person's rank, from least to greatest, is r.  This first nonzero wallet value is 1, say.  The next wallet must have value r, the next r(r+1), and so on.  A super-exponential distribution of wallets ensures that everyone but the richest person is willing to play.  But this is hardly a paradox... if the richest person is carrying an industrial nation's wealth in traveller's checks, this character might *strongly* resist playing this game with a bunch of people with relatively empty wallets.
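
A quick computational check (my code; the wallets below are generated by a slightly different but equally super-exponential rule -- each new wallet is just big enough to tempt the previously richest player):

    def willing(wallets):
        # For each player: is sum(richer wallets) > wallet * (number poorer)?
        return [sum(v for v in wallets if v > w) > w * sum(1 for v in wallets if v < w)
                for w in wallets]

    wallets = [0, 0, 0, 1]      # some zeros, then the first nonzero wallet
    while len(wallets) < 8:
        wallets.append(wallets[-1] * len(wallets) + 1)

    print(wallets)              # [0, 0, 0, 1, 5, 26, 157, 1100]
    print(willing(wallets))     # True for everyone except the richest player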

Suppose there are (R-1) other people in the audience, and your opponent is chosen randomly from among them.

You should begin by guessing a distribution of wallets.  Just guess some numbers and assign them to the audience.  Calculate for each person his or her rank r in the list of wallet-values w, ordering people from the richest to the poorest.  Let R be the maximum value of r, i.e., the number of people.  Let c be the cumulative sum of the wallets, so that to each rank r we assign c(r) = the wealth in the biggest r-1 wallets.  Let C be the sum of all wealth.  The person with rank r has R-r ways to lose w and r-1 ways to gain an expected c(r)/(r-1).  The expected benefit to the player with rank r is: c(r)/(R-1) - w * (R-r)/(R-1).  If w is normally distributed with respect to r, then the majority of players expect a small benefit, but the richest few expect a loss.
Or if R = 10 and the wallet-sizes are 5,4,4,4,3,3,3,2,2,2, we can arbitrarily choose which 4 should lose to which 4, and get a linear order:

 [r]  1  2  3  4  5  6  7  8  9 10 -- rank.
 [w]  5  4  4  4  3  3  3  2  2  2 -- wealth or wallet.
 [c]  0  5  9 13 17 20 23 26 28 30 -- what you can win; the sum of the wallets from people who will have to pay you.

If I am the fourth-highest wallet-holder, then with chance 3/9 I will win an expected 13/3; i.e., my expected winnings are 13/9.  With chance 6/9 I will lose 4; i.e., my expected loss is w * (R-r)/(R-1) = 4 * 6/9 = 24/9, which outweighs my expected winnings.
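
Here is a small check (my code) of the expected-benefit formula on this example:

    wallets = [5, 4, 4, 4, 3, 3, 3, 2, 2, 2]   # ranked richest-first
    R = len(wallets)
    for r, w in enumerate(wallets, start=1):
        c = sum(wallets[:r - 1])               # c(r): wealth in the r-1 richer wallets
        benefit = c / (R - 1) - w * (R - r) / (R - 1)
        print(f"rank {r:2d}, wallet {w}: expected benefit {benefit:+.3f}")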




Tim Harford in his book _Undercover Economist_ mentions the following twist: we now auction the prize w1+w2 to the two owners of the two wallets.  What is their winning strategy in an auction (with English-auction rules)?  Each player makes an estimate of the other player's wallet: e1 and e2 are the estimates of w1 and w2.  Now player 1 expects the value to be w1+e2 and player 2 expects the value to be e1+w2.  If player 1 bids w1+e2 and player 2 drops out, then player 1 pays w1+e2 to earn w1+w2.  He has paid e2 to win w2, which he perceives as a random variable with expected value e2.  The information available to him in the auction -- that player 2 continued bidding until w1+e2 -- can only raise his estimate e2.  Thus, each player should be happy to obtain the other player's wallet at its expected value.  Perhaps they should bid a little higher in order to take into account each other's information.

Monday, June 20, 2011

The two-envelope problem

Suppose you go to a conference where the speaker invites you up on stage.  He offers you two envelopes and tells you the ratio between the sums in them.  He says "the ratio between their values is r," where r > 1.  He gives you one envelope.  Then he says "want to switch?"  You might switch or not, judging what you think the speaker's motivation is, and what you think the speaker thinks of your motivation (reasoning about his reasoning is indeed something that a Bayesian actor should resolve -- e.g., the Princess Bride poisoned-cup reasoning).  After you've satisfied (or confused) yourself about which envelope might be better, do you still want to keep switching forever?

The paradoxical expected value:

An argument from expected value might indicate that you should switch: if your envelope is worth x coins, you stand to gain x * (r-1) or lose x * (1-(1/r)).  For example, if your envelope is worth 10 coins and the ratio between the envelopes is 2:1, then the other envelope is worth 20 coins or 5 coins.  By switching, you would gain 10 or lose 5.  If the odds of gaining and losing are just about equal (and, by symmetry, the odds that you have the less-valuable envelope might well be 1:1), then you benefit from switching.

But symmetry implies indifference.

In the two-envelope problem, this indifferent strategy seems optimal: switch if you feel like it (e.g., if it's nice to chat with the speaker), switch back again if you want (e.g., if you like to hear the audience laugh), and stop switching when it ceases to amuse you (the audience gets restless).  The paradox is that you might believe both that switching has no value (since the first envelope came to you randomly, and it could just as easily have been the other one) and that it has value (by symmetry, the odds of increasing or decreasing your wealth seem to be 1:1, and since the reward of increasing is greater than the cost of decreasing, the expected value is positive).

Prejudice helps; opening the envelope helps:

You can estimate the average unopened envelope's value from your prejudices about generosity, games, professors, conferences and money.  Look at the speaker's shoes... Check that it really is currency, and not a check, and figure out how much currency fits in an envelope, not a suitcase.  Think about whether it would have unduly inconvenienced the speaker to find deflated currency from a country which recently saw its currency lose value.  Does the speaker jealously watch the envelopes?  Does the speaker's briefcase have a handcuff on it, like a diamond-trafficker's?  Now just guess.  I guess ten coin, because at this moment I guess the speaker would want to offer the minimal value which doesn't appear cheap.  Now open the envelope.  If I open it and see a bill worth 100 coin, I have to increase my estimate of the value of the other envelope up from 10 coin, but not up to 100 coin or more.  My new estimate is some average of the old estimate and the observed value of 100 coin.  In any case, the new expected value is between the old expected value and the observed value, so switching away from a surprisingly good envelope has negative expected value (-EV).  On the other hand, if I open it and see a bill worth 5 coin, I expect the other envelope to contain something between 5 and 20 coin, so the same argument says I should switch away from a surprisingly bad envelope.

Prejudice may be enough:

The expected value of switching from an envelope containing x to the other envelope is: x * (r-1) * p(x is the smaller) - x * (1-(1/r)) * p(x is the larger).  The expected value of always switching is the integral of this over all values of x, weighted by the distribution of envelope values.  We might hope that this integral comes to zero, even for unusual distributions of envelope values.

Two-valued coins:

Suppose we live in a country which denominates its coins as 5, 10, or 50.  The coins have one number written on the back and one on the front.  The coins are always printed 5-10 or 10-50 -- i.e., with 5 on one side and 10 on the other, or with 50 on one side and 10 on the other.

Suppose that the coins grow on trees, inside of flat nuts.  The shell (or husk or shuck) of the nut obscures the values written on the coins.  The value of a random coin depends first on which type of coin it is, and second, on which face is showing.  This country has a tradition of gathering nuts around old trees and laying them in long rows in wasteland.  The coins are not buried.  Cultivators simply lay them in rows, where they bloom into lines of trees.  Each cultivator usually owns a row, and they compare one row to another to see which strategies generate the most value.  The value of the coin, and then of the tree, is measured by the number of fruits produced.  A coin laid with the value x facing down will produce x fruits.  Agriculturists have not been able to find out how to generate better-quality nuts.  The two types of nut are generated with equal probability and with either face up.  Farmers enjoy eating the fruit of these trees, so any method to routinely increase the value of the nuts, the trees, or the rows of trees would add to their happiness.

Some people can sense the value of the nut through the husk (or shell) with uncanny accuracy.  These children are profitably employed -- they go down a row of coins which have not yet germinated, and flip them so as to leave the better value facing down, where it counts.  For some reason, these children never pick up the coin, examine both sides, and leave the coin in its preferred orientation -- they only look at the upwards-facing value, and for this reason three things can happen:

They see 5, and they know it is a 5-10 coin in its preferred orientation.
They see 50, and they know it is a 10-50 coin that should be flipped.
They see 10, and they don't know whether it's a 5-10 coin or a 10-50 coin.

A winning strategy:

The professional switchers have settled on this strategy: when seeing 5 or 10, ignore the coin; when seeing 50, flip it.  Proof: If a 5 is showing, then the value 10 is in the dirt, and this is the best-possible orientation for that coin.  If the visible value is 50, then the value 10 is in the dirt, and flipping the coin creates 40 fruits' worth of value.  When we see the value 10, the number facing down in the dirt is equally likely to be a 50 or a 5, and the 40-fruit benefit of leaving a 50 facing down outweighs the 5-fruit cost of leaving a 5 facing down.  So it is best to leave the 10s alone.  The professional switcher simply looks for surprisingly good value going to waste and corrects only that sort of error.  A similar strategy applies if the accuracy of the nut-reading is imperfect, for any information is better than none.

Before opening the envelope:

We have a row of nuts laid out, and I can't read the values at all since I wasn't raised as a nut-reading savant.  Would we do better to flip all of our nuts?  We notice that the children flip 1 nut for every 3 they leave in place, so perhaps we should flip none of our nuts.  On the other hand, if we simply toss the nuts down in a random way, it seems clear that we can't expect to improve on the productivity of the row of trees by flipping every nut. 

It seems to me that this is the paradox: With information, you can improve the expected value of the line of trees.  Without information, you can't.  Indeed, the expected value of flipping a nut is:

    the value of flipping a 5, times the proportion of 5's,
+    the value of flipping a 10, times the proportion of 10's,
+    the value of flipping a top-value V, times the proportion of top-values.

If we suppose that coins A-T and T-V are equally likely (writing A=5 for the bottom value, T=10 for the shared middle value, and V=50 for the top value), then the proportions of 5's, 10's, and 50's showing are 25% - 50% - 25%, and that calculation yields:

    -(T-A) times 25%
+    -(V-T)/2 + (T-A)/2 times 50%
+    (V-T) times 25%

The first term is the cost of flipping the A=5's into the earth; we plant A=5 instead of T=10 and lose T-A fruits.  This happens to 1/4 of the coins in the tree-line -- those which happen to be showing an A=5.  The second term is the net effect of flipping the T's: we are sure to plant T, gaining T fruits, but we lose the buried V=50 or A=5 with equal probability.  The third term is the benefit from flipping the V's.  The children are always glad to see a top-value coin and flip it into the ground.  The benefit of doing this is V-T fruits.  About 25% of the nuts in the tree-line are showing the top value V=50 before being flipped.

Now, when we add those terms together, -5/4 - 40/4 + 5/4 + 40/4 = 0, and you can see from the variables that the cancellation is algebraic: the costs and benefits of flipping 10s balance the costs of flipping 5s into the dirt and the benefits of flipping 50s into the dirt. 
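
A simulation (my sketch of the model above) confirms both facts: blind flipping changes nothing, while the children's rule adds value.  Flip-nothing and flip-everything both average (5+10+10+50)/4 = 18.75 fruits per nut; the children's rule averages 28.75:

    import random

    def harvest(strategy, trials=1_000_000):
        total = 0
        for _ in range(trials):
            coin = random.choice([(5, 10), (10, 50)])   # the two coin types
            up, down = coin if random.random() < 0.5 else coin[::-1]
            if strategy(up):                 # decide by the visible face only
                up, down = down, up          # flip the nut
            total += down                    # the buried face bears the fruit
        return total / trials

    print("flip nothing:   ", harvest(lambda up: False))
    print("flip everything:", harvest(lambda up: True))
    print("children's rule:", harvest(lambda up: up == 50))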

Thursday, June 16, 2011

Gambling with the Rent

The working poor take a gamble on reduced rent: People who rent their homes and regularly lose at gambling seek smaller homes, allocating a significant portion p% of their losses to reduced demand for rental housing space.  People who rent their homes and suddenly win at gambling cannot seek to rent a larger home, since the winnings are not constant.  Therefore, they allocate much less than p% of their winnings to an increased demand for rental housing space.  Therefore, the gamblers as a whole demand less rental housing space and shift that demand to something else -- presumably consumption and investments.  Furthermore, many gamblers who win big will buy their homes or pay their mortgages or in other ways stop being a person who works, earns stable income, and pays a significant portion of that income in housing rent. 

Rental housing prices fluctuate more than most goods available to working renters.  Suppose working renters as a class allocate 10% of their income to gambling in one country, whereas they do not gamble in another country, where they pay r% of their income in rent.  The working gamblers earn W when they don't gamble; their landlords earn r% of W.  Where gambling occurs, the landlords earn r% of 90% of W, which equals r% of W minus p% of 10% of W.  This implies that p = r.  (For example, if W = 100 and r = 30%, rent spending falls from 30 to 27; the drop of 3 is 30% of the 10 lost to gambling, so p = r = 30%.)  This is consistent with the fact that r balances your marginal preference for a slightly larger apartment against your preference for other expenditures, whereas p balances your marginal preference for a smaller apartment against your preference to cut back on other expenses.  Working renters who demand 10% less beer will get 10% less beer.  But working renters who demand 10% less apartment space may find that landlords are not willing to leave too many apartments vacant.  The inflexibly large supply of rental apartments will cause the price of renting to fall until the renters are willing to take all the existing apartments (minus those which the landlords are willing to leave vacant) at the price which the renters are willing to pay.  The renters may end up paying 10% less for 2% less space.  That ratio 8%/10% -- when working renters pay 10% less in total, they get 8% of the space back for free -- measures the "inelasticity of the housing market."  We'll write n% for the "inelastic generosity" of landlords; it might be 80% in the short term.

Landlords subsidize gambling:

Thus, paradoxically, the working poor could spend a portion of their income gambling and reduce all their expenses, including the price they pay monthly for housing -- and it's about 80% true that their landlords will let them keep the same apartments at the reduced rent.

Do winners win, or only the lottery?

We suppose that the biggest impact of gambling is to transfer wealth from losing gamblers to winning gamblers.  The remainder -- salaries, casinos and other infrastructure, taxes, charitable donations and profits, or whatever governments and lotteries buy with their money --  should be relatively small.  The transfer of wealth between losers and winners decreases demand for those products which losing gamblers buy and increases demand for those products which gamblers buy after they win.  Losing gamblers consume less, but consumption markets can simply contract.  But so far we have not considered the economic impact of winners.  Working renters will apparently see their losses reduced by their landlords (who pay only r% times n% of the losses -- maybe 30% times 80% or just under a fourth -- but who can complain about receiving a fourth of their losses back?) regardless of how they lose their money.

Landlord generosity, an example.

Rental markets for working people are investments in fixed resources -- for this market to contract, housing prices would have to fall to the point where some renters would buy their apartments outright.  But falling prices for small apartments and falling rents are the same thing.  If the number of apartments does not contract (that is, if the landlords are all unwilling to leave any apartments empty in order to drive up prices), then n% is 100% -- the price of rent falls until the renters can again afford it.  Perhaps the landlords can change the renters' preference p% for apartments over other goods and services slightly, but probably not by the entire value of the renters' loss of income.  One could imagine gamblers a,b,c,...z who rent houses A,B,C...Z, where the houses are ranked from the nicest to the worst.  If the gamblers lose, they should all spend less on their housing: gambler z moves out into the street; gambler y moves to apartment Z, and so on, leaving apartment A free.  One gambler has won: gambler g moves out of the apartments altogether.  Gambler g might rent apartment A (which is now vacant) or might buy it.  The fact that one apartment now stands empty forces rents down until gamblers a,b,c,d,e,f,h...z move back into apartments B...Z, at lower rent.  In fact, if rent is the only fixed resource which the renters are consuming, then all their other consumption decreases should be met by contractions in those markets, causing all those prices to stay fixed.  Only their rent will diminish, and their rents should be lowered by exactly r% of their gambling losses.

Economic rent:

It may seem unfair to the landlords that they pay for someone else's gambling habits, and pay at 80%!  Of course, if the landlords were collecting only the value of the stream of services provided by an apartment (those services including shelter, safety, rest, showers, space to cook and eat hot meals, etc.), then they would charge an inflexible price -- below this price, it is cheaper for them to leave the house vacant than to let you grind dirt into the nice floors, cause fire hazards with your candles, shoot up the place, burn the furniture, drive wheelbarrows through the hedges and operate a meth lab in the bathroom or whatever else turns your fancy but runs down an apartment.  The price of an apartment would equal the expected costs of upkeep, plus some interest on the investments (the furniture in a furnished apartment, in addition to its cost of upkeep, costs the landlord the chance to invest elsewhere, so the landlord very reasonably expects to get a return on that investment).  But, in addition, there is some "economic rent" earned by those with deeds to municipal property, because land in a city is scarce and cities are profitable.  The fact that the city has become more profitable is information that the landlord only learns when commercial and housing rental prices per square meter rise.  It seems, from what I read in economics, that the price of rent includes economic rent on very valuable municipal property.  Economic theorists have proposed that a city which increases its property tax and decreases its income tax would free the marketplace of some inefficiencies (caused by income tax, but not caused by property tax, basically because income tax operates "at the margin," where it kills all transactions which would be marginally efficient without it).

Differential rent:  Do landlords capture workers' expected profits?

Suppose you get a job offer in a faraway city.  You could go there, rent an apartment, and do the job.  Or you could stay somewhere where you are already welcome, rent-free, and find the best job available there.  Perhaps your parents are tolerant, or perhaps you have exceptionally supportive siblings.  Each choice has a certain expected profit, where profit is winnings minus expenses:

Take the job in the city: Profits are income minus taxes, pension, transportation, food...
Write your book: Profits are expected sales minus taxes and some contribution to your benefactor's budget.

If the landlords are the only ones in this story who own any fixed resource, then by a law of economics (that all profits go to someone who has a fixed resource) the profits accrue to the landlords.  The total rent extracted from each worker then should be that worker's expected profits in the city versus staying home, where profits are income minus whatever expenses are so dear to the worker that he would rather pay them than move out of his hovel.

Here we suppose that the economic rent on a worker's unique and personal abilities is one reason the worker earns profit (income above opportunity cost); that the economic rent on land, another scarce resource, is the reason landlords earn profit; and that if the landlords control access to the city labor market, their profits should be differential rent -- the profit you get working a city job minus the profit you get by writing your book.  The landlord extracts this differential rent by renting squares of land and cubes of air at a fairly constant price, but by varying the quality, so that the squares available in cheaper apartments are of such terrible quality that no one who can afford a better place will dare to switch to a worse one.  Do they succeed in extracting differential rent?  If you never consider moving to a smaller apartment, but you do sometimes consider moving away from the city and writing your book, then perhaps they are succeeding in restricting your choices to the differential.

What do winners buy?

It would seem that renters' gambling causes a flow of money from slumlords to high-end landlords: everyone who loses loses a little, and all rents diminish.  The winners invest some money, which drives up the value of fixed investments, such as property.  The winners consume some high-end products, and the market for these products expands.  This increases the demand for services and materials, increasing wages and paying the owners of natural resources.  I have argued for the utility of gambling and then investing your money, so I hope that winners buy investments, as did this investing lottery winner.

Reports

If landlords reimburse the working renters for their collective gambling, then it should be noticeable that gambling depresses rental prices.  Studying the economics of gambling and residential property is complicated, however.  The most direct effect of legalizing gambling seems to be the rise in high-end property values, and this might be completely driven by the development of large casinos.  Perhaps we need to look not at Las Vegas but at Oregon, or at the effect on the economy when home games of poker became popular.

The article _Early impacts of limited stakes casino gambling_ by PT Long, 1996, states: "Although not true in all of the gambling communities, residential property in Black Hawk and Central City has experienced a substantial decline..."  A newspaper report on Macau property describes the rise of high-end property values through the story of an individual company: "Instead of casinos, the partners launched an opportunity fund in 2006 to invest in high-end residential, retail and commercial property in Macau to capitalise on the boom in gambling...  Riding on the back of the gambling boom in Macau, the value of the five assets owned by the trust rose..."  The Financial Times reported in its article _Asia-Pacific - Macao tries to cool housing market_ (29 Sep 2010) that "While gaming revenue rose rapidly, however, so did house prices. Residential property prices in Macao have increased almost a third..."  An academic review of the effects of gambling reports higher property values, without mentioning whether they are high-end or low-end, mentioning only business property: "higher property values lead to higher property taxes, which may make it more difficult for small business renters (though not necessarily for property owners)... Obviously, an increase in property values (and hence taxes) puts a squeeze on some operators, especially renters (property owning restauranteurs, even if they went out of business, reaped the gains of these property value increases), and again this is more a matter of distributional than aggregate impacts."  The obvious truth that casinos bring jobs, which increase rental prices, is brought home by this commentary: "A new industry like casino gaming may have jobs and increased income associated with it. These amenities will induce an increase in property values."

Are economic rents lower in home-rental prices in gambling towns?

Consistent with these predictions, I checked rental prices in ZIP Codes with similar populations, similar population density, and similar distance to the center of a city of half a million people -- finding cities of a similar size and general population density, finding similar postcodes in those cities, and using a rent calculator to find prices.  The ratio of the average price for a rental house to the average home property value is lower in suburban Las Vegas than it is in Portland, which is lower than it is in Denver.

Zip codes: 89109 (Nevada) 97220 (Oregon) 80121 (Colorado)
Price per square meter for a 1BR house: 0.96, 1.07, 1.10
Price per square meter for a 2BR house: 0.86, 0.90, 1.02
Price per square meter for a 3BR house: 0.76, 0.90, 0.99
Price per square meter for a 4BR house: 0.71, 0.78, 0.81

I'm assuming that working renters in Las Vegas will gamble more than working renters in Portland, who will gamble more than working renters in Denver.  And while these cities have the same size in population, there are lots of differences between Las Vegas, Portland, and Denver other than a presumption that the working renters gamble more in one place.

Gambling for your class.

Widespread gambling makes poor workers less profitable, but their rents would drop by a comparable amount.  The few who win large sums, invest them, and earn interest might stay in the city -- and compete for apartment space with those who earn high profits -- or might move home and collect dividends on their gambling wins without paying rent at all.  They no longer compete with the rest of their peers to rent apartments of the same quality as they have been renting.  Those who win small prizes can consume these small prizes and enjoy them, without worrying that those small prizes will be confiscated as rent.  This is perhaps the best part of the scheme: while your losses do decrease your rent, your winnings do not increase your rent, because if you can't predict them, neither can your landlord.

Tuesday, June 14, 2011

Is utility unbounded?

 A thought-experiment.

If utility is unbounded, then an infinite amount of money, offered at any finite odds, is infinitely more desirable than any finite amount of money.  We would notice this effect in that infinite prizes in lotteries would be surprisingly tempting.  Are you currently stealing from your friends in order to play high-stakes lotteries?  Would you gamble everything to play the lottery if the jackpot were infinite?  If you answer "no" and "yes," then maybe your utility curve is unbounded.  People do play lotteries more when the jackpot rises.  This suggests that people are able to manage their probabilities, make side bets, divide potential lottery winnings and otherwise make real sense out of quantities of money far greater than they have ever earned or consumed.  Apparently, when faced with a lottery which pays hundreds of millions of coin at thousand-million-to-one odds, a 50% increase in the jackpot increases the players' interest.  What would happen if the odds remained fixed at thousand-million-to-one but the jackpot were infinite?  We might prepare for the shock to the global economy and government as the world receives its monarch.  Would you, in addition, try to win?  Would you give up every comfort in your life to win?  Would you sell, steal, borrow?  Would you attempt grand crimes to obtain vast sums with which to win the lottery?  Would you abuse the trust of your family and friends?  If not, then you have put a value c on your own comfort and the good you can do as a good citizen, and you have put a value v on economic omnipotence such that

  v < a thousand million times c. 

v is the utility bound -- the net sum of all the good you could do and all the fun you could have with unlimited resources.  On the other hand, if you are reading this and your conscience whispers "Yes, I would beg, borrow and steal for a chance at v," then you have valued v > a thousand million times c.  That wouldn't prove that utility is unbounded.  To prove that utility is unbounded, we should give you worse odds, such as trillion-trillion-to-one, and better comforts, say c' = all the good you can do and all the fun you can have when you step into the role of someone -- anyone -- who you think seems to have a lot of fun and/or do a lot of good.  If your heart knows that you would happily give up all that person's pleasures and good work in order to ruin his life with your gambling addiction, then you believe that

  v > a trillion trillion times c'. 

We could check that v exceeds any finite bound by setting the odds arbitrarily low.  A person who believes in unbounded utility offers to behave badly if he or she believes in the existence of an infinite lottery.  Such a person would happily suffer any finite setback and cause any finite amount of damage in exchange for a chance at winning v.  Perhaps your neighbors are such people; your neighbors act normally because they do not believe in an infinite lottery and they are unwilling to behave badly in order to win the kinds of lotteries which they are offered.  If the neighbor is not stealing from you and buying lottery tickets with the stolen coins, it is because the neighbor values

a lottery jackpot * lottery odds < the cost of stealing a coin.

"Infinity"

It's a strange value in cost-benefit analysis because it keeps its pull at any odds, and it outweighs any finite cost.  We avoid this by believing in no infinitely-valuable properties.  If a thought experiment asks us to accept that something (the prize in a lottery) might have a value infinitely greater than anything else in the universe, we can reject that notion with the Archimedean principle that everything is comparable to everything else -- all pains and pleasures, goods and evils are comparable to each other.  Or, if some things are infinitely better than others, we can ask whether there is a world of top goods which are all comparable -- perhaps my own good citizenship has far-reaching and "infinite" consequences.

Data:

Someone who says "utility is unbounded" can depress the utility(coins) function to the point where, examining data-points with realistic expected values, we cannot tell the difference between his bets and those of someone who believes utility to be bounded.  You can fit bounded or unbounded functions to a scatter-plot of observations.  Maybe we can find some data about willingness to play the lottery for very high values, and this would show the upper end of the utility curve.  The surprising result seems to be that people will play a lottery with low expected value if the prize is high, suggesting that they value the marginal coin more than a coin in the hand; that makes sense to me only in terms of investing that coin, not consuming it.  I wrote about consumption, investment, and gambling when discussing the utility of gambling.

Friday, June 10, 2011

The Utility of Gambling

People gamble.

Economists assume people use their money rationally.  If a simple model suggests that a common behavior is a losing strategy, economists seek an extended model which elucidates the behavior.  If the extended model makes predictions, these should be tested against data.  The usual argument against gambling is that it "violates stochastic dominance": if the expected payoff is 96% of the bet, then playing is irrational.  Gambling suggests that the marginal utility of money is increasing, whereas in many surveys the marginal utility of money is seen to be decreasing.  An economist might predict that when the marginal utility of money is increasing for a rational person, then that person will gamble.

Gambling suggests that ROI increases with wealth.

Warren Buffett makes a better ROI on his investments than I do.  If all of that advantage comes from his brains, then I can't copy him.  But if part of his advantage comes from being rich already, then I should gamble.

ROI model:

Some investments generate an income stream for the owner.  Stocks and bonds pay dividends.  Owning a house near where you work generates an income stream -- you don't have to rent a house, and if you take a long vacation, you can rent it out.  Suppose the investment opportunities I would face after winning a gamble are better than those which I face now.  To take a simple model, suppose that an individual with wealth w (by using her spare time to learn about new business ventures, or because she can afford to diversify into risky ventures without risking her daily quality of life, or because large sums of wealth are slightly easier to manage, or because she can put some of it into long-term investments instead of keeping it all in a cash account) earns interest at a rate of (2 + log(w))% on her wealth.

If the Anderson family invests 1 coin and compounds it 100 times at that rate of interest, they will then have 27.71 coins.  The Bakers and the Cooks follow this strategy for 10 steps.  At that point they get "anxious" to reach their savings goal.  They take a gamble, to win 0.1 coin or lose 0.1 coin, with odds of 51:49 in favor of losing; i.e., they accept odds slightly worse than 50:50, so that the casino can make a profit.  The Bakers lose and the Cooks win.  They then compound their money at the variable rate of interest for the remaining 90 steps.

That is... the Andersons apply the following iteration 100 times:

wealth = wealth * (1 + 0.01 * (2 + ln(wealth)))

The Bakers and the Cooks do likewise, but after compounding their initial 1 coin 10 times, they gamble 0.1 coin -- the Cooks add 0.1 coin to their wealth and the Bakers lose 0.1 coin.  The results, after compounding 100 times in all, are:

A: 27.713150
B: 22.654267
C: 33.364858

The expected wealth of those who follow the Bakers-Cooks strategy is (B * 0.51 + C * 0.49), since there are 51 losers to every 49 winners in this lottery.  That value, about 27.90, exceeds the 27.71 realized by A.
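
A minimal Python sketch (mine) reproducing those three numbers under the stated iteration:

    from math import log

    def compound(wealth, steps):
        for _ in range(steps):
            wealth *= 1 + 0.01 * (2 + log(wealth))
        return wealth

    A = compound(1.0, 100)
    B = compound(compound(1.0, 10) - 0.1, 90)   # lose the gamble at step 10
    C = compound(compound(1.0, 10) + 0.1, 90)   # win the gamble at step 10
    print(A, B, C)                              # ~27.71, ~22.65, ~33.36
    print(0.51 * B + 0.49 * C)                  # ~27.90, which exceeds A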

Of course, the Cooks and Bakers can do much better still by betting with each other instead of against the casino.

Saving towards a goal.

In real life, people often gamble so as to "make up the difference" between their savings and a desired investment goal -- a house or a business, for example.  When you have a goal in mind, and when you expect that goal to increase your quality of life and generate an income stream with a better ROI than the investments you already own, is it rational to gamble your savings and take a chance on securing the new, desired investment either earlier or later than you would expect?

An example -- brothers buying houses.

Suppose my next purchase will be a house and that I want to buy it with cash so that I will have neither rent nor mortgage payments coming due monthly.  This investment thereby represents an income stream to me.  I can wait for the cash to accumulate and then buy, and then begin to enjoy the utility and income stream from the purchase.  Or, when I have some portion of the money and find a convenient opportunity to buy, I can gamble and stake my savings against the money needed for the investment.  If I win, then I get the income stream early, with its increased ROI.  If I lose, then I get it later.  The average of these two conditions is better than simply waiting.  Suppose that when I start work, my income stream is $50 per day (above expenses).  Suppose that at this rate it will take me 20 years to save the money to buy a house and that I earn no interest on my savings.  Suppose that, having got the house, my income stream will be $150 per day.  After working for 10 years, my brothers gamble.  51% of them lose everything -- their savings are now 0 and they continue to earn income at $50 per day.  On the other hand, 49% of them get their house and now earn income at $150 per day.  My brothers' average income stream is now greater than my own.  The gamble was similar to a pact -- they could have decided to pool their money and get houses, one by one, as they were able to do so.  Gambling and pacts are efficient if ROI is higher for greater sums of wealth.  The Teachers' Credit Union may be a pact allowing teachers to earn ROI available to their total wealth, rather than the ROI available to their individual wealth.
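
A quick check of that arithmetic (mine; it uses only the simple terms stated above -- $50/day saved, a house worth 20 years of savings, $150/day afterward):

    p_win, p_lose = 0.49, 0.51
    print(p_win * 150 + p_lose * 50)   # 99 > 50: the brothers' average income
                                       # stream rises immediately after the gamble

    # Winners buy at year 10; losers restart from zero and buy at year 30;
    # the patient saver buys at year 20.
    print(p_win * 10 + p_lose * 30)    # 20.2: slightly later on average, which
                                       # is why the scheme pays only if the
                                       # house's ROI beats the savings' ROI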

A friendly game of poker:

If you want to buy a house, then join my poker club.  When we have enough savings between us, we will play poker (or, if someone is too good at that game, something more random) until we all lose our small investments and one person has all the money and goes to buy the house.  We could have done some complicated thing where we give him the money and then force him to continue to contribute, but gambling makes this simpler.  If there are only 3 of us and the cost of a house is 6*x, then one of us will save 2*x and then get the house (and the income stream!); another will save 2*x and lose it and then save 3*x and get a house; the third will lose 2*x, lose 3*x, and then save 6*x and get the house.  Together, we saved the price of three houses, and some of us got the income stream early.  If ROI is equal for the rich and the poor, then it was all a game.  But if the ROI we get from owning a home is better than the ROI we earned on the savings while we saved, then it is clever to lose your savings so that another of you can close the deal on the good life.
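
Here is a small sketch (my code) of that schedule, assuming all three members save at the same constant rate:

    x = 1.0                          # a unit of savings
    price = 6 * x
    bought_at = []                   # per-member savings accumulated at each win
    elapsed = 0.0
    for remaining in (3, 2, 1):
        elapsed += price / remaining # each remaining member saves an equal stake
        bought_at.append(elapsed)    # the winner buys a house now

    print(bought_at)                 # [2.0, 5.0, 11.0] times x, as in the text
    print(sum(bought_at))            # 18.0 = the price of three houses
    print(sum(bought_at) / 3)        # 6.0: the average wait is unchanged; the
                                     # gain is that some income streams start early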

Lotteries:

In a lottery, the payoff can be very low.  Considering that the lottery winner pays tax and that the lottery's payout rate may be only 50%, the effective payoff might be 25%.  That is still rational if the ROI on the winnings is more than 4 times the ROI the player was earning on the money which the player gambles.