Best of Donald Catlin
Shuffle Trackers: Try Goin' Crazy
21 January 2001
I recently received a request from a gaming developer, one of my customers, to do an additional analysis on one of the bets in one of his card games. Specifically, he was worried about the effect of shuffle trackers (sometimes called slug cutters). In case you are unaware, a shuffle tracker is a person who watches for clumps (or slugs) of discards that are presumed good for the player, tries to follow the clump, or perhaps part of it, through the shuffle, and then cuts these cards to the beginning of the next shoe.
The bet in question was a bet on a tie between the dealer and the player. As in many card games, e.g., Baccarat and Blackjack, the tens and face cards all had the same value. This developer was worried that if there were known to be some tens in the two hands, that this bet might be favorable to the player. As we will see, the matter turned out to be surprisingly complex and led to a fascinating story.
No developer wants his or her game to be mentioned in the same breath as shuffle tracking, so I am not going to use the actual game in this article. Rather, I am going to make up a couple of games; one to illustrate the turn of events and the other to help you understand why the results turned out the way they did. So to paraphrase the old Dragnet show: The story you are about to hear is true. Only the games have been changed to protect the innovator.
Let us suppose that the game in question is played with four decks, the card values are as shown (Ace being 1) and all face cards are value 10. Player and dealer each receive two cards and each one's card values are added to obtain a total of 2 through 20. If the totals are the same, called a tie, the player wins, otherwise the player loses. I wrote a short program to calculate probabilities for this game and it turns out that the probability of a tie is approximately 0.0649. Thus a payoff of 10 to 1 for a tie would produce a house edge of 28.61% (the real game is more generous than this).
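These figures can be reproduced with a short exact enumeration; here is a minimal Python sketch (the original program's language isn't specified), assuming the four-deck composition just described, dealt without replacement:

```python
from itertools import product
from fractions import Fraction

# Four decks: 16 cards of each value 1 through 9, plus 64 ten-valued
# cards (tens and face cards), for 208 cards in all.
counts = {v: 16 for v in range(1, 10)}
counts[10] = 64
TOTAL = 208

p_tie = Fraction(0)
for hand in product(range(1, 11), repeat=4):   # (player, player, dealer, dealer)
    if hand[0] + hand[1] != hand[2] + hand[3]:
        continue                               # not a tie
    remaining = dict(counts)
    n, p = TOTAL, Fraction(1)
    for v in hand:                             # deal the four cards in order
        p *= Fraction(remaining[v], n)
        remaining[v] -= 1
        n -= 1
    p_tie += p

print(float(p_tie))           # about 0.0649
print(float(1 - 11 * p_tie))  # house edge at 10 to 1: about 0.2861
```

The Fraction arithmetic keeps the enumeration exact; at a 10 to 1 payoff the expected return per unit is 11p - 1, hence the house edge of 1 - 11p.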
Obviously if one knew that the first four cards out of the shoe were all 10s, the tie would be a sure bet. What if fewer than four 10s were known to be among the first four cards? Using the 10 to 1 payoff, my program produced the following results:
I should mention that in the real game the figure for three tens was actually positive and larger than the figure for two tens. This made the following turn of events even more dramatic.
I reported my results to the developer, who quietly thanked me and went about his business. The next day I got a call at my university office from him saying that I must have made a mistake. He pointed out that if three of the first four cards were tens, then the remaining 205 cards would contain 61 tens. The only way to tie would be if the fourth card were a ten and the probability of this happening is 61/205 or 0.29756. This would give the player an edge of 227.32% and not 5.34%. Yikes! The calculation seemed simple enough and sounded correct, so I told the developer that I must have made a mistake somewhere and that as soon as I got home I would recheck my program, fix whatever was wrong, and call him back with (presumably) corrected results. Needless to say I was embarrassed and upset with myself. How could I have made such an obvious error?
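The developer's arithmetic is easy to check; a quick Python sketch (at 10 to 1, the expected return per unit is 10p less the 1 - p chance of losing the stake):

```python
from fractions import Fraction

# Three tens are gone from the shoe: 61 of the remaining 205 cards are tens.
p_tie = Fraction(61, 205)          # about 0.29756
edge = 10 * p_tie - (1 - p_tie)    # = 11p - 1
print(float(edge))                 # about 2.2732, i.e. a 227.32% player edge
```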
When I got home I brought up my program and looked it over. It is not a complicated program and everything looked okay to me. John Robison once told me the following. "Insanity is repeating the same thing over and over under the exact same conditions and expecting a different outcome each time. Unless you're dealing with the internet." Well, I wasn't dealing with the internet but I did the insane thing anyway - I reran the program. Guess what? I got exactly the same results. Now I was going crazy. What in the world was going on here? I looked at the program again, in detail. It was fine. Finally out of sheer desperation I set up a scenario in my program wherein the top three cards off of the deck were tens and the fourth was unknown. Continuing with my insane behavior I reran the program under what I presumed to be the exact same circumstances and --- what? --- I saw a house edge of -227.32%. Now I was really going crazy. I then decided to hand calculate everything and that is when it hit me. There was nothing wrong. I had been experiencing a probabilistic phenomenon that is behind an old, old sucker bet involving three pennies. I'll show you this in a moment. I was able to hand calculate most of the numbers that I'll present below and these exact same numbers were computed by my computer program. The program was fine. I had suckered myself.
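The 5.34% figure can be reproduced once the conditioning is spelled out; a Python sketch (a reconstruction that matches the quoted number, assuming "three tens" means at least three tens somewhere among the first four cards of the four-deck shoe):

```python
from math import comb
from fractions import Fraction

TENS, OTHERS = 64, 144   # four decks: 64 ten-valued cards, 144 others

# "At least three tens among the first four cards" splits into two cases:
#   exactly three tens: one hand totals 20, the other 10 + x with x < 10,
#                       so a tie is impossible;
#   four tens:          both hands total 20, so a tie is certain.
ways3 = comb(TENS, 3) * comb(OTHERS, 1)   # exactly three tens
ways4 = comb(TENS, 4)                     # all four tens
p_tie = Fraction(ways4, ways3 + ways4)    # about 0.0958
print(float(11 * p_tie - 1))              # about 0.0534: a 5.34% player edge
```

Conditioning instead on the first three specific cards being tens gives the 227.32% figure; the two statements sound alike but describe very different events, which is the whole point of what follows.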
Let me revise the above table as follows. I assure you that as strange as they may look to you, these numbers are absolutely correct.
Now you're going crazy right? How can those last two numbers both be correct? If you're interested, write to me and I'll show you how to hand calculate both of them. Rather than do that here, which I don't think would be too enlightening, I should like to try to explain, with some simpler examples, just what is at work here.
What about that sucker bet with the pennies that tipped me off (that I was the sucker)? Here it is; I'll take the role of the wise guy (it's about time). I'll flip three pennies and offer you the following proposition. Whenever two of the pennies are heads you get to wager on the outcome that the third will also be a head; if there are no heads or one head the game is a push. Now if two of them are heads there is a 50-50 chance that the third one will be a head. So this is an even money game. But just to show you what a big sport I am, I'll give you 3 to 2 odds on the game. Sound good? Well, when you see the following, you might want to change that word sport to another "s" word.
Here is the sample space for the game. Think of the coins as penny 1, penny 2, and penny 3; HTH will mean that penny 1 was heads, penny 2 was tails, and penny 3 was heads. The eight equally likely outcomes are:

HHH  HHT  HTH  THH  HTT  THT  TTH  TTT
Notice that there are four outcomes that have at least two heads and only one that has all three heads. The game is a 4 to 1 shot. So, at a 3 to 2 payoff, the player's expected return is

(1/4)(3/2) - (3/4)(1) = 3/8 - 6/8 = -3/8, or -37.5%
So I have a 37.5% advantage in this game. Some sport!
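The penny proposition is small enough to enumerate outright; a Python sketch:

```python
from itertools import product
from fractions import Fraction

outcomes = list(product("HT", repeat=3))           # 8 equally likely flips
live = [o for o in outcomes if o.count("H") >= 2]  # the bet is on
wins = [o for o in live if o.count("H") == 3]      # all three heads

p_win = Fraction(len(wins), len(live))             # 1/4, not 1/2
ev = Fraction(3, 2) * p_win - (1 - p_win)          # 3 to 2 payoff
print(p_win, float(ev))                            # 1/4 -0.375
```

The fallacy is treating "some two pennies are heads" as though two specific pennies had been named: four of the eight outcomes qualify for the bet, and only one of them wins.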
To see how this type of probabilistic misunderstanding can lead to the confusion that I and my developer experienced, I have designed a simple dice game called Goin' Crazy that has features similar to the card game above but is easier to analyze. I think you see where I got the name.
The Goin' Crazy game is played with 4 dice, each one a different color from the others. All of the dice have the same markings; four of the sides have a 1 and two of the sides have a 0. The red and blue dice are called the player's dice; the yellow and green dice are the dealer's dice. All four dice are put in a dice cup and given a good shaking. The cup is inverted on the table on a special spot containing sensors. These sensors can read the color and up number of each die. The player's two numbers are added and the dealer's two numbers are added. The player bets on the two totals being the same.
Now unlike the chore of tracking a deck, we are going to freely offer the player information about what is under the cup. That is to say, there are several possible forms of Goin' Crazy that we could offer the player depending upon what information we give him. Let's look at the sample space for this game and decide on a payoff that gives an edge to the house. Information will be given in the form of red, blue, yellow, and green outcomes, in that order. For example, 1010 will mean that red shows 1, blue shows 0, yellow shows 1, and green shows 0. Notice that this is a tie outcome. There is a 2/3 probability of any die coming up 1 and a 1/3 probability of it coming up 0. The probability for the outcome at hand is 2/3 x 1/3 x 2/3 x 1/3 = 4/81. Rather than list all of these fractions, and to keep from driving John Robison crazy when he lays out this article, I'll just list these as frequencies out of 81 games. Here goes.

Outcome  Frequency
1111     16
1110      8
1101      8
1011      8
0111      8
1100      4
1010      4
1001      4
0110      4
0101      4
0011      4
1000      2
0100      2
0010      2
0001      2
0000      1
Looking at this table we see that there are six outcomes that are ties and the frequency of these in 81 games is 33. So the probability of a tie is 33/81 and an even-money payoff will provide the house with an edge of 15/81 or 18.52%. Now, since a 1 is more probable than a 0, it would seem that having 1s in one's hand would be an advantage. Is it?

Suppose our sensor tells the player that at least one of the dice is a 1. What is the probability of a tie given that there is a 1 somewhere? Well, the frequency of at least one 1 is 80 and the tie frequency in this case is 32. The probability of a tie given at least one 1 is, therefore, 32/80 and the house edge in this case is 20%; it went up. Compare this to our card game.

Suppose instead that our sensor tells the player that the red die is 1. There are 54 such outcomes and 24 of them are ties. The probability of a tie in this case is 4/9 and the house edge is 1/9 or 11.11%; smaller. Again, compare this to our card game.

What if our sensor says that there are at least two 1s under the cup? There are 72 such outcomes and 32 of them are ties. The probability of a tie is again 4/9 and the house edge is 11.11%.

Suppose, on the other hand, the sensor tells the player that at least one of the player's dice is 1 and at least one of the dealer's dice is 1. In this case there are 64 such outcomes and 32 of them are ties; the house edge is zero.
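All of these conditional edges fall out of one small weighted enumeration; a Python sketch (each die shows 1 with probability 2/3, so a 1 carries weight 2 and a 0 carries weight 1, for a total weight of 81):

```python
from itertools import product
from fractions import Fraction

outcomes = list(product((0, 1), repeat=4))   # (red, blue, yellow, green)

def weight(o):
    # A 1 is twice as likely as a 0, so each 1 contributes weight 2.
    w = 1
    for d in o:
        w *= 2 if d == 1 else 1
    return w

def house_edge(condition):
    # Even-money tie bet, given that the sensor's report is true.
    live = [o for o in outcomes if condition(o)]
    ties = [o for o in live if o[0] + o[1] == o[2] + o[3]]
    p = Fraction(sum(weight(o) for o in ties), sum(weight(o) for o in live))
    return 1 - 2 * p

print(house_edge(lambda o: True))                    # 5/27, about 18.52%
print(house_edge(lambda o: sum(o) >= 1))             # 1/5, i.e. 20%
print(house_edge(lambda o: o[0] == 1))               # 1/9, about 11.11%
print(house_edge(lambda o: sum(o) >= 2))             # 1/9 again
print(house_edge(lambda o: o[0] + o[1] >= 1
                 and o[2] + o[3] >= 1))              # 0
```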
Now, here comes the sucker bet. If I were going to market Goin' Crazy, and I am certainly not going to market this miserable game, this would be the pitch. On every roll the sensor tells the player whether or not there are three or more 1s under the cup. If there are not at least three, the game is a push. Otherwise the player can bet as much as he or she wants on the tie. Well, if there are already three 1s, the player needs just one more 1 to get a tie. Since 1s are twice as likely as 0s, this is a 2 to 1 shot. At even money this is a great opportunity and the player should bet the farm. Oh yeah? Take a look. There are 48 outcomes having three or more 1s and only 16 of these are ties. The probability of a tie is 1/3 and the house edge in this case is a whopping 33.33%. Wow! Compare this to the situation wherein the sensor tells us that both of the player's dice show 1s and the dealer's yellow die shows a 1. In this case the player has an edge of 33.33% over the house. I guess I wouldn't want to offer the game in this form. Compare these two situations to the last two entries of our card game table; see the similarities?
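The two contrasting pitches can be checked the same way; a Python sketch (weights as before: a 1 counts double):

```python
from itertools import product
from fractions import Fraction

outcomes = list(product((0, 1), repeat=4))   # (red, blue, yellow, green)

def weight(o):
    w = 1
    for d in o:
        w *= 2 if d == 1 else 1   # a 1 is twice as likely as a 0
    return w

def tie_prob(condition):
    live = [o for o in outcomes if condition(o)]
    ties = [o for o in live if o[0] + o[1] == o[2] + o[3]]
    return Fraction(sum(weight(o) for o in ties), sum(weight(o) for o in live))

p_any3 = tie_prob(lambda o: sum(o) >= 3)                  # three 1s somewhere: 1/3
p_named3 = tie_prob(lambda o: o[0] == o[1] == o[2] == 1)  # red, blue, yellow all 1: 2/3
print(1 - 2 * p_any3, 1 - 2 * p_named3)                   # 1/3 -1/3
```

"There are three 1s somewhere under the cup" and "these three particular dice show 1s" are different conditions, exactly as with the three tens in the card game.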
There is a lesson here for shuffle trackers. Knowing that a certain clump of cards is rich in what seems to be a favorable card for the player may at times be a snare and a delusion. Certainly in Blackjack this technique seems to work. But in a game with a tie or other similar wagers, I would certainly look at things long and hard before I laid out my big stake on a slug cut. I just might be arranging a sucker bet for myself. See you next month.
This article is provided by the Frank Scoblete Network. Melissa A. Kaplan is the network's managing editor. If you would like to use this article on your website, please contact Casino City Press, the exclusive web syndication outlet for the Frank Scoblete Network. To contact Frank, please e-mail him at firstname.lastname@example.org.