Gambling Wisdom: Kelly Criterion Part 1

Economics, Gambling (general)

October 8, 2008

In this series I discuss topics that are well known within the gambling community, but that may be new to players who come to the game of poker from a non-gambling background.

Maybe you’ve had this experience –  I know I have.  You’re studying some form of math, and you’re not quite clear on why it works the way it does or what it has to do with anything – someone’s just stuck this formula in front of you.   And then you go back and read about why it was invented in the first place, and suddenly you realize you’re holding the answer to everything, or at least a much bigger chunk of everything than you originally thought. I remember this happening with differential calculus – I was pretty good at calculating derivatives of stuff, and knew they were in essence the slope of the function in question, but I wasn’t sure why I cared.  Then by chance I was reading something by Newton, and it hit me: THIS IS THE MATH OF STUFF THAT MOVES!  From then on, calculus made perfect sense.

I just had a similar experience with the “Kelly Criterion” – John L. Kelly’s formula for sizing bets.  And like the helpful guy I am, I want to share.  I previously wrote here about how I didn’t like the Kelly Criterion for bankroll management for poker.  And I still don’t, at least as it’s usually presented.  But I now believe that Kelly was far more insightful than I originally gave him credit for.  As with calculus in my little story above, the formula I was force-fed isn’t that useful, but the idea behind it is brilliant.

Kelly’s Big Idea

Here’s Kelly’s idea: gamblers need to manage their money in a way fundamentally different from people who get their money via a salary or similar.  To illustrate the difference, let’s consider two cases:

Edward the Employee: Edward currently has $10,000 to his name.  By means of his employment, he gets paid an additional $2,000 per week.  He decides to spend $5,000 on a new big-screen TV this week and has $5,000 left.  Ten weeks from now, he will have $25,000.  If he hadn’t bought the TV, he would have $30,000 instead – the expense of the TV subtracts directly from his bankroll indefinitely into the future.

Gary the Gambler: Gary likewise has $10,000 to his name.  By means of his gambling activities, he increases his bankroll by 20% each week.  He likewise decides to spend $5,000 on a TV now.  Ten weeks later, he will have increased his remaining $5,000 to almost $31,000.  But if he hadn’t bought that TV, he would have increased his bankroll to almost $62,000 – the expense of the TV cuts his bankroll in half indefinitely into the future.

Stop and think about that for a second, because this is hugely important: for an employee, income and expenses are additive.  You just sum up all the income, subtract all the expenses, and that’s how much money you have left.  For a gambler, profit and loss are multiplicative – you take your starting bankroll, multiply by all the profits (a 5% profit would be a multiplier of 1.05) and losses (a 4% loss would be a multiplier of 0.96), and that’s how much money you end up with.
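The two cases can be sketched in a few lines of Python, just rerunning the Edward/Gary figures above:

```python
def edward(start, weekly_income, weeks):
    # Salary money adds: final = start + income * weeks
    return start + weekly_income * weeks

def gary(start, weekly_growth, weeks):
    # Gambling money multiplies: final = start * (1 + growth) ** weeks
    return start * (1 + weekly_growth) ** weeks

# Both start with $10,000 and consider a $5,000 TV.
print(edward(10_000 - 5_000, 2_000, 10))      # 25000 with the TV
print(edward(10_000, 2_000, 10))              # 30000 without: the TV costs $5,000, period
print(round(gary(10_000 - 5_000, 0.20, 10)))  # 30959 – almost $31,000 with the TV
print(round(gary(10_000, 0.20, 10)))          # 61917 – the TV cost half of every future bankroll
```

Notice the TV costs Edward a flat $5,000 no matter how far out you look, while for Gary it permanently halves the whole trajectory.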

Now we get to Kelly’s observation: things that maximize expectation in an additive way fail to maximize expectation in a multiplicative way.  In other words, nearly everything you’ve ever learned about expectation math (including what I’ve previously written) is very subtly wrong. If you look at the old expectation article, you’ll notice that what we did is average the various outcomes of a bet weighted by likelihood to get our expectation.  That’s all well and good if you’re thinking about gains and losses in an additive way since averaging inherently involves adding.  But it’s not so good from the gambling perspective where wins and losses multiply.  So Kelly suggested an alternative, indeed a criterion for placing bets:

Gamblers should choose the play that maximizes the likelihood-weighted average of the logarithm of the possible results.

Now that’s a mouthful! I know most people have an innate aversion to logarithms, but I hope you’ll stick with me on this, because I think this may be the most important concept in gambling.  It certainly makes the top 5.  If you’ve forgotten about exponents and logarithms, I suggest you read up here or do a little reading elsewhere before we go on.

Now that you’re refreshed on logarithm properties, I want to remind you of one in particular: adding logarithms is like multiplying numbers.  That’s property 1. in the “Properties of Logarithms” section, on the right hand side, in the PDF I linked you to above.  Now Kelly’s Criterion should make more sense – when you want to average a bunch of things (which inherently involves addition), but the things you want to average are multiplicative instead of additive in nature, use their logarithms instead.  Since multiplying is like adding logarithms, you can now happily average the logs.  For the math geeks out there this is in essence the same concept as taking a weighted geometric mean.
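A minimal sketch of that idea in Python: averaging the logs of two multiplicative results, then exponentiating, recovers their weighted geometric mean – the per-trial growth factor.

```python
import math

# Two equally likely multiplicative outcomes: a 5% win and a 4% loss.
outcomes = [1.05, 0.96]
weights = [0.5, 0.5]

# Kelly's suggestion: take the likelihood-weighted average of the logs.
avg_log = sum(w * math.log(x) for w, x in zip(weights, outcomes))

# Undoing the log gives the weighted geometric mean: (1.05 * 0.96) ** 0.5
growth_factor = math.exp(avg_log)
print(round(growth_factor, 5))  # 1.00399 – the bankroll grows ~0.4% per trial
```

The ordinary (additive) average of 1.05 and 0.96 is 1.005, but the quantity that actually governs long-run bankroll growth is the geometric mean, 1.00399.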

Some Examples & Avoiding Going Broke

Let’s try to put the Kelly Criterion into practice and see how it differs from raw expectation.  Suppose we have the following gamble: we put in $1, and half the time (1:1 odds) we get paid $4.  The rest of the time we get nothing.  This gamble scales – we can put in as many dollars at once as we like and get paid proportionally – and it is repeatable.  Clearly this is a license to print money – the expectation is that we get out $2 for every $1 we put in, netting a $1 win on average.  So how much of our bankroll should we wager each time – let’s say our bankroll is $1,000?

Basic expectation math says we should wager our entire bankroll.  That gives us an expectation of winning $1,000 on average.  And if this were a one-time opportunity and we intended to give up gambling afterward, we should indeed wager everything.  But that’s not how this example works – the gamble is repeatable.  Thus, there is an additional cost to going broke – we have to give up on this incredible gambling opportunity.  It should strike you that this is exactly like the “Gary the Gambler” example above – if we multiply our bankroll by zero (i.e. go broke), that gets carried forward indefinitely and we lose the opportunity to make money in the future.  What this means is that in cases where you have a repeated profitable gamble, you absolutely must avoid going broke, or even losing a big fraction of your bankroll, because the loss of future opportunities to gamble is so costly.

So clearly we’re going to want to bet some fraction of our bankroll, not all our bankroll.  This is where the Kelly Criterion comes into play.

define F as the fraction of our bankroll we will wager (F is between 0 and 1)

define P = 0.5 – the probability we win our wager

define Q = 1 - P = 0.5 – the probability we lose our wager

define R = 3 – the amount we profit if we win (assuming we wagered 1 unit) (4 units back minus 1 unit cost)

define our starting bankroll to be 1 unit in size

We want to find the value of F that maximizes the average of the log of the two possible results (we win, or we lose).  Notice that each result is expressed in terms of our final bankroll, not just the amount we won or lost on the wager:

If we win, our resulting bankroll will be 1 + RF

If we lose, our resulting bankroll will be 1 - F

so we want to find the F that maximizes

P log(1 + RF) + Q log(1 - F)

Now, if you’ve forgotten how to maximize a function of one variable: you differentiate with respect to that variable, set the derivative to zero, and solve (checking that the solution is a maximum rather than a minimum).  I’ll leave the math as an exercise for the reader and tell you that you get:

F = (RP-Q)/R

which is the same formula you’d find if you looked up the Kelly Criterion here – I’ve just got slightly different variable names.

Then we plug in our values for R, P, and Q and get F = 0.333… – we should wager a third of our bankroll each trial to maximize the rate of growth of our bankroll.
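Here’s a short check of that result: compute F from the closed form, then confirm it by brute-force maximization of the expected log bankroll.

```python
import math

P, Q, R = 0.5, 0.5, 3  # win probability, lose probability, profit per unit wagered

# Closed-form Kelly fraction.
f_formula = (R * P - Q) / R
print(f_formula)  # 0.3333333333333333

def expected_log(f):
    # Likelihood-weighted average of the log of the final bankroll.
    return P * math.log(1 + R * f) + Q * math.log(1 - f)

# Scan F over (0, 1) on a fine grid and pick the maximizer.
candidates = [i / 10_000 for i in range(1, 10_000)]
f_numeric = max(candidates, key=expected_log)
print(f_numeric)  # ≈ 0.3333 – agrees with the formula
```

The brute-force scan is overkill here, but it’s a handy sanity check for gambles whose payout structure doesn’t fit the simple win/lose formula.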

Ok, that was some pretty heavy lifting.  I hope I’ve got you thinking about the Kelly Criterion and haven’t bored you to tears.  There will be a followup article shortly, because I’ve got a ton more to say about this.


7 Responses to “Gambling Wisdom: Kelly Criterion Part 1”

  1. jUzAm Says:

    Hello Wayne,

    I have a little question :
    I would like to know how much to buy-in into online cash games while managing my bankroll via the Kelly Criterion
    I play full 100 big blinds deep

    If I have, say, a winrate of 3 big bets (6 big blinds) per 100 hands, and a standard deviation of, say 50 big bets per 100 hands, which percentage of my bankroll should I use to buy-in ?
    Could you adapt the formula to use variables such as winrate and standard deviation instead of odds and probability of a bet ?

    Thank you again for all your great articles !

  2. Wayne Vinson Says:

    That’s a good question. I’m not sure off the top of my head. Let me think about it a bit.

  3. jUzAm Says:

    I thought about a thing like that :

    F = (RP-Q)/R
    define D the standard deviation
    define W the winrate
    R = (D+W)/D
P = Q = 0.5

    but I strongly doubt it’s correct…

    Btw, thank you in advance (I asked the same question to some other ppl, so I’ll let you know if you don’t find and they do)

  4. jUzAm Says:

    I found a post on 2p2 forum that fits what I searched :

    jay_shark :
The Kelly criterion says that if you have a 2% advantage, i.e., you win 51% of the time, then you should be willing to invest 2% of your bankroll at even money.

    In cash games, we usually measure our win-rate in terms of big blinds/100 hands or $’s /100 hands but you can measure your win-rate in terms of percentages as well.

    If we regard your standard deviation as the size of the bet of the coin flip, then we have
    wr = b*(p-(1-p)) = b*(2p-1) where b is your standard deviation.

    example 1 :
    If you’re playing NL100, with a win-rate of 10 big blinds/100 hands and a standard deviation of 100 big blinds/100 hands, then your win rate in terms of percentages may be regarded as

    10 = 100*(2p-1)

    solving for p we get p=55% and your advantage is 10% which means you need to invest 10% of your bankroll to maximize the growth rate of your bankroll.

    example 2:
    Your wr = 14 big blinds/100 hands and your standard deviation is 95 big blinds/100 hands.

    14 = 95*(2p-1)

    solving for p we get,
    p ~ 57.368% . So your advantage in terms of %’s is 2*p-1 which is about 14.736 %

    There is a nice simplification to the size of your bankroll to maximize bankroll growth rates.

    It is simply
B = s.d.^2/(k*wr) where k is your kelly fraction.

    Plugging in k=1 for examples 1 and 2 respectively
    we get B = 100^2/(10*1) or B=1000 and B= 95^2/(14*1) or B~ 644.6428

    It is a routine check to see that 100/1000 = 10% and that 95/644.6428 is ~ 14.736%

    Source :
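jay_shark’s arithmetic can be verified with a short script; the variable names (wr for winrate, sd for standard deviation, k for the Kelly fraction) follow the post above, and the script is just a check, not part of the original post.

```python
def edge(wr, sd):
    # Treat the standard deviation as the bet size b of a coin flip,
    # so wr = sd * (2p - 1); solve for p and return the advantage 2p - 1.
    p = (wr / sd + 1) / 2
    return 2 * p - 1

def kelly_bankroll(wr, sd, k=1):
    # Bankroll needed to maximize growth: B = sd**2 / (k * wr)
    return sd ** 2 / (k * wr)

# Example 1: wr = 10 bb/100, sd = 100 bb/100
print(round(edge(10, 100), 5))           # 0.1 – invest 10% of bankroll
print(kelly_bankroll(10, 100))           # 1000.0 big blinds

# Example 2: wr = 14 bb/100, sd = 95 bb/100
print(round(edge(14, 95), 5))            # 0.14737
print(round(kelly_bankroll(14, 95), 4))  # 644.6429 big blinds
```

The final sanity check from the post holds as well: sd divided by the bankroll (95 / 644.6429) equals the computed advantage of about 14.737%.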

  5. Wayne Vinson Says:

    I’ll have to read all that and give it some thought.

  6. jUzAm Says:

    Sorry for all these comments, nothing appeared on my screen so I posted a lot of same comments, please delete all but one, and sorry again

  7. Wayne Vinson Says:

    Comments have to be approved before they appear. Sadly I get hundreds of spam a day. I think I cleaned everything up.
