The Joel on Software Discussion Group (CLOSED)

A place to discuss Joel on Software. Now closed.

The Odds of Winning Klondike Solitaire

Theory:

The odds of winning any game of Klondike Solitaire of the form (Draw X, Re-Deal Y, Win Z) are equivalent to the average of the sum of the weighted ratios of the length of the longest increasing subsequence of each winning permutation to the length of the permutation.
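
In symbols, one way to write that: with W the set of winning permutations and LIS(π) the length of the longest increasing subsequence of a permutation π,

$$ P(\text{win}) = \frac{1}{|W|} \sum_{\pi \in W} \frac{\mathrm{LIS}(\pi)}{|\pi|} $$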

Proof:

TBD

Thus, for a "standard" game of Klondike of the form (Draw 3, Re-Deal Infinite, Win 52), the percentage of solvable games is 66.875% as determined by the above formula.  The percentage of unsolvable games is 0.025% and the percentage of games that cannot be won is 33.1%.

You may download my Solvable Solitaire game and play a few games or animate them. (Simply unzip it on your desktop.)  Watch and observe them.  The solver makes some strange decisions, but they seem to be the right ones.

Load the unsolvable and the lost games. (Click 'Load' and point it to the "Games" Directory.) Attempt to play them yourself.  See if you can beat them.

The games have a naming convention like Lost_5_4.bin or Unplayable_5.bin, meaning Unplayable game number 5, and Lost game number 4 with a max of 5 cards to the suit piles.
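
Here's a small sketch of that naming scheme, just as an illustration (not code from the game itself):

# A rough illustration of the file-name convention above (my own sketch, not
# code that ships with the game): Lost_<max cards>_<game #>.bin and
# Unplayable_<game #>.bin.
import re

def describe(name):
    m = re.fullmatch(r"Lost_(\d+)_(\d+)\.bin", name)
    if m:
        return f"Lost game #{m.group(2)}, at most {m.group(1)} cards reached the suit piles"
    m = re.fullmatch(r"Unplayable_(\d+)\.bin", name)
    if m:
        return f"Unplayable game #{m.group(1)} (no moves possible at all)"
    return "unrecognized file name"

print(describe("Lost_5_4.bin"))      # Lost game #4, at most 5 cards ...
print(describe("Unplayable_5.bin"))  # Unplayable game #5 ...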

If you load the Lost_45_1.bin deck file, animate it, and think to yourself, "I can beat that," then play the game yourself and get more than 45 cards to the foundation piles, you have found a bug in the solver.  The same goes for any "lost" game.

The "games.bin" file contains games that have been "won." (52 cards to the foundation piles.)

Comments, questions welcome.  BBL.

http://www.doublebuffer.com/downloads/Solvable%20Solitaire.zip

Interesting link (first 8 minutes) "The Mathematics of Solitaire":

http://www.uwtv.org/programs/displayevent.aspx?rID=1986&fID=571
Dave B.
Thursday, February 01, 2007
 
 
What's the exact distinction between "unsolvable" and "a game that cannot be won"?

Also, how is "winnable" defined? It seems like you declare a game winnable if your solver can solve it, which works, but the converse doesn't. There's information that the solver doesn't have access to (in the face-down deck of cards) at the start of the game, so even a theoretically optimal solver may lose games that could have been won had it played differently.
Mark Jeffcoat
Thursday, February 01, 2007
 
 
Thanks for the reply Mark.

>> "What's the exact distinction between "unsolvable" and "a game that cannot be won"?"

Unsolvable is a typo and should be "unplayable."  In other words, no cards can be moved - anywhere.  That is as opposed to a game that has been "lost" with 0 cards to the suit stacks, in which case some cards can be moved, just not to the suit stacks.  A game that has been won, in this case, has 52 cards placed to the suit stacks.  So you have unplayable, lost and won games.

>> "Also, how is "winnable" defined? It seems like you declare a game winnable if your solver can solve it, which works, but the converse doesn't."

So you're saying if my solver can't solve it - it may still be solvable.  This is a touchy subject. 

A game is a goal.  How one reaches that goal is defined by rules.  The name of a game normally refers to a set of rules. 

Game - Score the most points. Rules: Baseball, Basketball, Soccer, Hockey, etc...

Game - Sort a shuffled deck of cards into piles according to suit. Rules - Klondike, Vegas Klondike, Patience Sorting, Spider, etc... 

Rules amplify or lessen the odds based on strengths and weaknesses in the process used to accomplish the goal.

**Changing the rules changes the process used to accomplish the goal thus changing the odds.**

If one follows the rules and comes up short it follows that the goal is not accomplishable given that set of rules (and the strengths and weaknesses of the process used to carry out those rules).

So we have games that are won (Absolutely), games that are lost and games that are unplayable.  If you load a "lost" game and find that you can get more cards to the foundation piles than the game specifies, then you have proven the solver inadequate. (i.e. You load Lost_8_3.bin and get 12 cards to the foundation piles (without cheating)).

>> "There's information that the solver doesn't have access to (in the face-down deck of cards) at the start of the game, so even a theoretically optimal solver may lose games that could have been won had it played differently."

I'm not quite certain what you're saying here.  It's not in the rules to "know what the face down cards are".  We could change the rules to allow this but that would "change the game" thus changing the odds. (Of what advantage would that be anyway?)

Please don't misconstrue me.  This is a tough problem up for debate.  I don't know whether I'm right or wrong.  Basically I'm just throwing stuff out there to be mirrored back at me.
Dave B.
Thursday, February 01, 2007
 
 
Yeah, my explanation of how an optimal solver could conceivably lose winnable games was unclear. Let me pick a different example, and make it concrete:

(I'll be extra verbose for everyone who hasn't played this game recently. And because I get wordy when I'm tired.)

You make seven stacks of cards at the beginning of the game, each stack has its top card face-up. The smallest stack, on the far left, begins with only one face-up card, the largest, on the far right,  has one face-up card and six others face down.

Let's say the face-up cards of your initial deal look like this:

      A x x x x K K

That is, an ace, two kings, and other stuff.

It's clear (enough) that you always improve your position by playing that ace immediately, so let's assume that your optimal solver does so.

Now, you've got an empty slot, to which you could move one of your kings--often a good thing, since you get to turn over the card under it.

Assume, arguendo, that an optimal solver will move one of those kings. Unfortunately for it, this is a carefully contrived example, where under one of those kings is the crucial card, or set of cards, that you'll need to win this game. (Maybe it's all the 10s, maybe the other 3 aces, or something more subtle; it doesn't matter.)

No matter which king the solver moves, there exists a possible deal where the crucial cards were under the _other_ king. The game was winnable, but the optimal solver didn't solve it.

Why don't I immediately revoke the solver's claim to being "optimal", since it failed a winnable game?

There are 52 cards in the deck, and we can see seven of them before we make a move. (A few more if you're allowed to make a pass through the deck to get a look at every third card without playing any. I'll neglect that, since the argument is the same.) At the start of a game, then, the solver only knows that it's faced with 1 out of the 45! possible deals.

If moving the right-most king into the empty slot will lead to a win in (45!-1) of those deals, and only result in disaster for that last deal, it's clear what move the optimal solver will make. And if it fails 1/45! of the time, well, ain't none of us going to live long enough to see it.
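
To put a rough number on that:

# A quick back-of-the-envelope check, nothing more.
import math

deals = math.factorial(45)     # orderings of the 45 cards we can't see
print(f"{deals:.3e}")          # about 1.196e+56 possible deals
print(f"{1 / deals:.2e}")      # about 8.36e-57 chance of hitting the one bad one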


Obviously, the numbers aren't really that one-sided, but the argument only fails if it can be shown that this situation--where you don't know for sure which of two or more moves is better because you can't see some of the deck--can never happen, or, at least, can always be avoided.

That would be quite difficult to prove. I rather doubt it can be proven at all, since I don't think it's true. But I won't attempt a complete counter-example right now, because I should really get back to work.
Mark Jeffcoat
Friday, February 02, 2007
 
 
>> "Unfortunately for it, this is a carefully contrived example, where under one of those kings is the crucial card, or set of cards, that you'll need to win this game."

>> "No matter which king the solver moves, there exists a possible deal where the crucial cards were under the _other_ king. The game was winnable, but the optimal solver didn't solve it."

How is the game winnable if, when played by the rules, the "crucial" cards could not be had?  The fact is that the game is not winnable and is considered a loss.

In his book "Strategic Solitaire," David Berveiler refers to these games as "lock-outs."  A lock-out is a sequence of cards that makes a complete win impossible.

The solver will win the game if, when played by the rules, the game is winnable.  If not, then the game is considered a loss with at most X cards placed to the suit stacks.
Dave B.
Friday, February 02, 2007
 
 
I'm still entertained, so let's try another highly contrived example. This one takes place very near the end of a game:
                  K  K  (J)  [5]

  _  [7]  [8]  [9]  [10]+  [K]+  (K)++

The two red suits have been completed, and only the two black suits () and [] remain. The left-most slot is open, and there are 4 cards still face-down, each represented by +. There are no cards left in the deck.

You have only two options: you must move either (K) or [K] into the empty slot, and flip over another card. If you uncover either [6] or (Q), you win. If not, you lose.

(There are 24 possible permutations of the 4 unknown cards: some of them are guaranteed wins (both kings lead to a winning card), and some are guaranteed losses. Those are boring, so let's only consider deals like [Q][6][J](Q) (the jack is on top of the queen). If you move [K], you'll uncover [6] and win. If you move (K), you uncover [J] and are completely stuck.)

The game is perfectly winnable from this point. The worst solver in the world will win about half the time. Unfortunately, if the shuffling is really random, even the most stupendously amazing of solvers can never learn anything about the position of the winning two cards without moving a king. So it won't win this perfectly winnable game more than about half the time, either.
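
Here's a quick count of those 24 cases, taking the rule above at face value (a king-move "wins" exactly when the card it flips over is [6] or (Q)):

# Enumerate the 24 ways the four hidden cards can sit in the four hidden
# slots: under [10], under [K], and the top and bottom positions under (K).
# Per the simplified rule above, a king-move wins exactly when the card it
# flips over is [6] or (Q). This is not a full game simulation.
from itertools import permutations

hidden = ["[6]", "[J]", "[Q]", "(Q)"]
winning_flips = {"[6]", "(Q)"}

both = neither = depends = 0
for under_10, under_blackK, parenK_top, parenK_bottom in permutations(hidden):
    k_wins = under_blackK in winning_flips        # what moving [K] flips over
    paren_k_wins = parenK_top in winning_flips    # what moving (K) flips over
    if k_wins and paren_k_wins:
        both += 1          # guaranteed win: either king works
    elif not k_wins and not paren_k_wins:
        neither += 1       # guaranteed loss: neither king works
    else:
        depends += 1       # it comes down to which king you pick

print(both, neither, depends)        # 4 4 16
print((both + depends / 2) / 24)     # 0.5; no strategy beats a coin flip here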


This is, again, quite contrived, but I don't believe you can prove that an optimal solver will always be able to avoid positions where the best it can do is guess.

(None of this would be true, by the way, in a variant of the game where all the cards were exposed from the very beginning. It wouldn't make every game winnable, of course, but it's clear that the brute force method of (recursively) trying out every possible move and pruning the losers must always find a solution if one exists. )
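
Roughly the shape of what I have in mind, as a sketch; the three callbacks are placeholders for a real Klondike engine (this isn't code from any actual solver), and states are assumed to be hashable:

# Depth-first search with pruning of repeated positions, for the variant
# where every card is exposed. `legal_moves`, `apply_move`, and `is_won`
# are hypothetical callbacks standing in for a real Klondike engine.

def solvable(state, legal_moves, apply_move, is_won, seen=None):
    """Return True if some sequence of moves from `state` reaches a win."""
    if seen is None:
        seen = set()
    if is_won(state):
        return True
    if state in seen:       # already explored this position: prune it
        return False
    seen.add(state)
    return any(
        solvable(apply_move(state, move), legal_moves, apply_move, is_won, seen)
        for move in legal_moves(state)
    )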
Mark Jeffcoat
Friday, February 02, 2007
 
 
If you're questioning whether my solver would test both Kings the answer is - yes.

Like you said:

"it's clear that the brute force method of (recursively) trying out every possible move and pruning the losers must always find a solution if one exists"
Dave B.
Friday, February 02, 2007
 
 
I suppose that explains the confusion. You're either playing an entirely different game, or using a different definition of "solver" than I am.

I'll bow out now.
Mark Jeffcoat
Friday, February 02, 2007
 
 
Thanks for the questions Mark.  Feel free to post or email any further questions or comments you may have.
Dave B.
Friday, February 02, 2007
 
 
It's another long night of discussion with the compiler, and talking about solitaire is a fun diversion. So I'm going to take another crack at sorting out what's going on:

I now see two meaningful definitions of "solvable":

1. A game (that is, a single specific deal of the cards) is solvable if there exists at least one sequence of moves that will result in winning.

2. A game is solvable if it is solvable in sense (1), and there is a strategy that will guarantee finding a winning sequence of moves.


I think that you've been talking about solvable-1 all along, and I've been talking about solvable-2.

I've been arguing that it's very unlikely that every game that is 1-solvable is 2-solvable.

I still don't have the foggiest idea what your theory statement means, and how you used it to compute the number of solvable games (66.875%). (But I do see how you could get a plausible estimate by having your solver attack a large number of random deals.)
Mark Jeffcoat
Saturday, February 03, 2007
 
 
I'm not sure how to answer you Mark.

>>> "But I do see how you could get a plausible estimate by having your solver attack a large number of random deals."

This is definitely not how I came up with the percentage of winnable games.  It is, however, a plausible method of getting an estimate.  When I run, say, 10,000 games using the ran2 generator from Numerical Recipes (or the Windows Crypto Generator or the Mersenne Twister), I get numbers *around* 66.875% - sometimes it's higher and sometimes it's lower.  There's some percentage of error there.
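
For what it's worth, that run-to-run spread is about the size you'd expect from a 10,000-game sample (a quick binomial sanity check, assuming independent games):

# Standard error of a win-rate estimate from n independent games.
import math

p, n = 0.66875, 10_000
stderr = math.sqrt(p * (1 - p) / n)
print(f"{stderr:.4%}")    # about 0.47%, so runs landing anywhere around
                          # 66.4%..67.3% are unsurprising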

>>> "I still don't have the foggiest idea what your theory statement means, and how you used it to compute the number of solvable games (66.875%)."

The idea is this.  There are 5 moves that can be made during a game of Klondike Solitaire (discounting flipping the cards face up which should be done automatically).  They are:

1. Waste Pile to Foundation Pile
2. Waste Pile to Base Pile
3. Base Pile to Base Pile
4. Base Pile to Foundation Pile
5. Deal

5! = 120.  There are 120 possible permutations of these moves.  Not *all* of these are valid permutations.  In other words, you can't win a game by following the moves in the order the permutation lists for each round of the game.  For example, the permutations that begin with 5 are not valid because they Deal first, and you can't win a game by just dealing.  Permutations beginning with other moves are also invalid for various reasons.  As it turns out, if you remove *all* invalid permutations you end up with 32 *valid* permutations scattered throughout the 120 total.  These have a unique spacing pattern similar to that of prime number spacing, but not quite the same.  So we have 32 valid permutations.  Now, take the longest increasing subsequence of each of those 32 winning permutations.  You will get 32 numbers between 1 and 5.  Remember that the length of the permutations is 5.
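
To illustrate the longest-increasing-subsequence part, here's a small sketch (the rule that narrows the 120 permutations down to the 32 valid ones isn't reproduced here):

# Enumerate the 120 permutations of the five move types and compute the
# longest increasing subsequence (LIS) of one sample permutation. The
# filter that keeps only the 32 "valid" permutations is not shown.
from itertools import permutations

MOVES = (1, 2, 3, 4, 5)   # the five move types listed above

def lis_length(seq):
    """Length of the longest strictly increasing subsequence (simple O(n^2))."""
    best = [1] * len(seq)
    for i in range(len(seq)):
        for j in range(i):
            if seq[j] < seq[i]:
                best[i] = max(best[i], best[j] + 1)
    return max(best)

print(sum(1 for _ in permutations(MOVES)))   # 120 permutations in total
print(lis_length((2, 1, 3, 5, 4)))           # 3, e.g. the subsequence 1, 3, 5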

Count @ Longest Inc Sub = (LIS / Length of Perm * Count) = X
 1 @ 5 = (5/5 * 1)  =  1.0
11 @ 4 = (4/5 * 11) =  8.8
18 @ 3 = (3/5 * 18) = 10.8
 2 @ 2 = (2/5 * 2)  =  0.8
Total = 21.4
21.4 / 32 = 0.66875 = 66.875%
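
And the same arithmetic as a tiny script (the counts per LIS length come straight from the table above; they aren't re-derived):

# Reproduce the weighted average from the table above.
PERM_LENGTH = 5     # each permutation is over the 5 move types
VALID_PERMS = 32    # valid permutations, as counted above

lis_counts = {5: 1, 4: 11, 3: 18, 2: 2}   # LIS length -> number of permutations

weighted_total = sum(lis / PERM_LENGTH * count for lis, count in lis_counts.items())
print(f"{weighted_total:.1f}")                 # 21.4
print(f"{weighted_total / VALID_PERMS:.5f}")   # 0.66875 -> 66.875%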

Changing the algorithm used to solve the game changes all of these numbers and is directly related to the percentage of games the solver wins.

Hopefully you can follow that.  Feel free to ask any further questions.  I'll do my best to answer them.
Dave B.
Sunday, February 04, 2007
 
 

This topic is archived. No further replies will be accepted.
