The Things That Really Matter

Now that the year is drawing to a close and the holidays are upon us, I have time to revisit the things that really matter in life, including the Royal Game.

Some time ago I uploaded my Ninja Monkey code to GitHub (my username is hilarioushappystar) and now I wish to discuss this in more detail.

To generate a “result” I need two seeds, which I call the “game seed” and the “monkey seed” for lack of a better alternative. Let us assume a pair of seeds is written as (g, m), where g and m represent the game and monkey seeds respectively.

The game seed determines the initial position. The monkey seed determines the monkey’s behaviour. For instance, if the seeds are (27, 1) then the monkey would always start with the move “ca”, but if the seeds were (27, 2) then the monkey would prefer the move “cb” in the same starting position.
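For the programmers out there, the natural way to achieve this is with two independent random number generators. The sketch below is illustrative only – the function and variable names are mine and don’t necessarily match the code in the repository:

    import random

    def play_game(game_seed, monkey_seed):
        # Two independent RNGs: one fixes the deal, the other fixes the monkey's choices.
        deal_rng = random.Random(game_seed)      # determines the initial position
        monkey_rng = random.Random(monkey_seed)  # determines the monkey's behaviour

        deck = list(range(104))                  # two standard packs = 104 cards
        deal_rng.shuffle(deck)                   # same game seed -> same starting position

        # ... deal the tableau and stock from `deck`, then let the monkey choose
        # its moves via monkey_rng until the game is won or lost ...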

The screenshot below shows a game seed of 27 and a monkey seed of 1. The monkey loses with 24 face-down cards remaining. I do not claim this to be a paragon of virtue from a software engineering perspective.

Of course, the result is pseudo-random in the sense that with game seed = 27 and monkey seed = 1 the monkey always loses with exactly 24 face-down cards. At least that’s the result I get on my machine. Your machine may yield different results, but if your monkey loses with e.g. 16 face-down cards the first time then it should lose with 16 face-down cards every time.

Here are my results; each entry is the number of face-down cards remaining at the end of the game. Note that the game seeds are not indexed from 1 to 20. By using different numbers for the monkey and game seeds, I reduce the chances of confusion when trying to reproduce these results.

Note that these numbers also encode the result of the game: if there is at least one face-down card remaining, the monkey must have lost. I assume the monkey always wins if it manages to expose every card in the tableau (otherwise I could always put an asterisk next to a zero if the unthinkable happens).

             Monkey seed
Game seed     1    2    3    4    5    6    7    8    9   10
       27    24   27   27   19   10   17   29   26   22   14
       82    17   24   18   20    9   25   16   18   20   22
       41    14   21   10   11   27   27   15   19   20   17
      124    26    9   33   18   21   26   26   30   23   30
       62    27   23   27   23   21   26   21   22   21   25
       31    18   24    0   24   14   25   16   28   26   20
       94    16   17   29   25   15   26   24   27   26   20
       47    18   18   20   21   24   28   18   22   26   23
      142    14   11   21   16   23   23    9   15   19    9
       71     4   27   13   17   21   22   12   21    8    8
      214    23   33   32   22   22   30   22   32   20   33
      107    17   14   17   21   18   18   16   18    0   15
      322    27   26   24   22   26   22   32   25   22   30
      161    20   13   15   14   19   11   16   12   12   12
      484    14   14   14   14   11    9    8   11    0   11
      242    25   15    0    5   25   14   23   25   19   23
      121    17   20    0   27   26   17   21   27   18   21
      364    26   28   19   24   22   20   18   25   19   24
      182    25   23   25   28   26   22   18   31   17   23
       91    19   15   10   13   12   17    7    7   17   14

We can make a few observations:

  • Out of 20 games, there are 5 hands where the Monkey scores exactly one victory. For the other 15 hands the Monkey never wins.
  • These five wins appear in only two columns. This is a statistical glitch: there is no logical reason why two games with the same monkey seed and different game seeds should be correlated. I blame the small sample size 😊 (about the only utility of the columns is to assist in reproducing the raw results).
  • Some hands look really bad. For instance, in game 214 the Monkey always has at least 20 face-down cards at the end of the game.
  • Other hands look promising. For instance, in game 484 the Monkey has at most 14 cards remaining.

This is an example of an “exploratory analysis” (as opposed to an explanatory analysis). I’m trying to get familiar with the data and I don’t have a specific hypothesis that I’m trying to prove. Of course, the more data you collect, the better your chances of finding something interesting. For instance, I could have chosen 15 monkey seeds instead of 10, or 50 game seeds instead of 20.
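For the curious, once the table is in machine-readable form, observations like the ones above take only a few lines of Python to reproduce. A minimal sketch (only two rows of the table are shown, and the dictionary name is mine):

    # `results` maps game seed -> face-down cards remaining for monkey seeds 1..10.
    # Only two rows of the table are reproduced here for brevity.
    results = {
        27:  [24, 27, 27, 19, 10, 17, 29, 26, 22, 14],
        214: [23, 33, 32, 22, 22, 30, 22, 32, 20, 33],
    }

    for game_seed, row in results.items():
        wins = row.count(0)   # zero face-down cards remaining = a win
        print(game_seed, "wins:", wins, "best:", min(row), "worst:", max(row))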

Once you have completed your exploratory analysis, you might be able to form a concrete hypothesis about a Spider Solitaire program which you suspect to be dodgy. For instance, suppose that Shay Dee Games releases a new version of the Royal Game and we find that in every hand, the Monkey either consistently finishes with 10 or fewer face-down cards or consistently finishes with 20 or more. We would suspect something is amiss, even if Shay Dee has the “correct” average win rate computed over all games. Of course, all this assumes we are able to determine the identity of every face-down card.

When testing a concrete hypothesis, things start to get technical. T’was brillig slithy toves gyre gimble wabe blah blah blah blah blah blah blah blah blah blah blah blah blah blah blah blah blah blah blah blah blah blah blah blah Kolmogorov-Smirnov test blah blah blah blah blah blah blah blah blah blah blah blah blah blah blah blah – or if you want the less technical version, yes we have formally proved that Shay Dee Games is indeed rigged.
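For what it’s worth, here is a rough sketch of how such a test might look in Python with SciPy. The numbers below are invented purely for illustration – they are not real data from Shay Dee Games or anybody else:

    from scipy.stats import ks_2samp

    # Face-down cards remaining over many games: a trusted server versus a
    # suspect server (both samples invented for illustration).
    trusted = [24, 19, 10, 17, 29, 26, 22, 14, 18, 20]
    suspect = [8, 6, 30, 31, 7, 28, 29, 5, 32, 6]

    statistic, p_value = ks_2samp(trusted, suspect)
    print(statistic, p_value)  # a tiny p-value suggests the two distributions genuinely differ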

Note also that I have omitted certain stats. For instance, I could have recorded the maximum number of empty columns obtained at any stage of the game, the number of suits removed, or the number of levels in Toy Blast I manage to beat before the monkey finishes the game. I have also omitted discussion of specific programs that I suspect to be biased. The important point is that the reader has something to go on if he wishes to investigate the veracity of a specific Spider Solitaire program.

Until next time, happy Spider Solitaire playing!

Monkey Algorithm User Interface Goes Graphical!

I have continued to achieve awesomeness in all things Spider Solitaire.

Hitherto my Spider Monkey algorithm program was text-based, but I have successfully converted it into a graphical user interface and uploaded the new version to GitHub.

The screenshot below shows an example position from a Spider Solitaire server which I suspect may be biased. But since I am not too confident of my suspicions (unlike the other one, where even a 1-suit algorithm was enough to expose the bias), I have redacted the name of the file, which would give away the software company in question.

The graphical interface allows the user to initialise a starting position from a text file or generate a random position. The user can then get the AI to step through the hand one moveblock at a time (repeatedly clicking “single moveblock”) or fast-forward through the entire game (clicking “all moveblocks”). A moveblock is essentially a sequence of moves that exposes one or more cards, either through turning over cards in the tableau (woohoo!) or dealing from the stock (gasp!).

You may have noticed the game started with “ja” instead of the superior “cg” or “ij”. The reasons for this have been discussed in a previous post. The evaluation function awards 10 points for a turn-over, 1 point for an in-suit build, and +100 for removing a complete suit (note that removing a complete suit loses the points gained for its twelve in-suit builds, so the effective bonus is only +88 instead of +100).
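In code, the evaluation function is about as simple as it sounds. A minimal sketch (constant and function names are mine, not necessarily what appears in the repository):

    # Illustrative scoring constants and evaluation function.
    TURNOVER = 10        # each face-down card exposed
    IN_SUIT_BUILD = 1    # each in-suit join currently in the tableau
    COMPLETE_SUIT = 100  # each suit moved to the foundations

    def evaluate(turnovers, in_suit_builds, suits_removed):
        # A removed suit takes its twelve in-suit builds with it, so the
        # effective bonus per suit is 100 - 12 = 88.
        return (TURNOVER * turnovers
                + IN_SUIT_BUILD * in_suit_builds
                + COMPLETE_SUIT * suits_removed)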

Note that there is no “equity” assigned to empty columns or columns that are close to becoming empty. This could be a future task, either for myself or someone else who wishes to contribute to my project.

I have used a four-color deck instead of the usual red-black colors found in most card games. I have also used gray for face-down cards and added a minus sign for good measure (to be color-blind friendly). The stock is shown in the bottom half of the window and I have decided to show the cards face-up for ease of visibility (they could equally well have been face-down).

https://en.wikipedia.org/wiki/Four-color_deck
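In code, the color scheme boils down to a simple lookup, something like the sketch below (the exact colors in my repository may differ):

    # Illustrative four-color scheme, plus gray for face-down cards.
    SUIT_COLORS = {"s": "black", "h": "red", "d": "blue", "c": "green"}
    FACE_DOWN_COLOR = "gray"
    FACE_DOWN_TEXT = "-"   # minus sign, for color-blind friendliness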

Here is another screenshot showing some extremely long moveblocks in action. Sometimes long move sequences are needed to resolve a critical position, but other times a long sequence is just idle moves, such as oscillating an Ace between two different columns. If you see a sequence ending in [(5,9),(9,5),(5,9),(9,5),(5,9)] then it’s almost certain the algorithm is waiting to reach the move limit, which I have hard-coded as 30 – and yes, you are most welcome to download my code, get rid of the hard-coded parameters and send me a pull request 😊
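If you are wondering what a moveblock looks like in code, here is a rough sketch (the names are illustrative; the real code on GitHub is organised a bit differently):

    MAX_MOVES = 30   # the hard-coded move limit mentioned above

    def play_moveblock(game, choose_move):
        # Keep making moves until a new card is exposed or the limit is hit.
        moves = []
        for _ in range(MAX_MOVES):
            move = choose_move(game)       # the AI's preferred move
            game.apply(move)
            moves.append(move)
            if game.exposed_new_card():    # woohoo! the moveblock is complete
                return moves
        game.deal_from_stock()             # gasp! no turnover, so deal instead
        return moves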

My code was written using tkinter which is a standard GUI library for Python. I only picked this up a few days ago, so I can’t claim my code is the most efficient (in terms of speed or readability), but it seems to get the job done and I am not aware of any mundafental muck-ups yet. So it’s all good 😊
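For anyone who hasn’t met tkinter before, the overall shape of the GUI is roughly the following (a bare-bones sketch only, not the actual code in the repository):

    import tkinter as tk

    root = tk.Tk()
    root.title("Spider Monkey AI")

    # The tableau and stock are drawn on a canvas, which gets redrawn
    # after every moveblock.
    canvas = tk.Canvas(root, width=900, height=600, bg="white")
    canvas.pack()

    def single_moveblock():
        pass   # ask the AI for one moveblock, then redraw the canvas

    def all_moveblocks():
        pass   # keep playing moveblocks until the game is won or lost

    tk.Button(root, text="single moveblock", command=single_moveblock).pack(side=tk.LEFT)
    tk.Button(root, text="all moveblocks", command=all_moveblocks).pack(side=tk.LEFT)

    root.mainloop()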

My GitHub username is hilarioushappystar, which by some strange coincidence also happens to be my username on the Cracking the Cryptic discord server 😊

And now we digress …

Fun fact: if I really wanted to design the best possible AI for Spider Solitaire, I believe it would be something to do with neural networks, trained over a large number of games just like the well-known AlphaGo program (or equivalent programs for Chess). Of course, I may not have the resources to achieve this. But I am reasonably satisfied with a simple AI given that (AFAIK) there are no existing AI programs that play the Royal Game without rot13(haqb).

The Unofficial Spider Solitaire Song

It’s unofficial – Spider Solitaire finally has a theme song. With my many talents it’s a wonder this didn’t happen sooner rather than later, but at least I can now rest with a clear conscience.

This was mainly thanks to last week’s foray into the wonderful world of GitHub. The user “leapfroglets” made a simple remake of a Spider Solitaire server with some half-decent victory music which plays after successfully completing a game (of course, you can also cheat by cloning the repository and clicking the audio file “bg-music.mp3”). The project was completed during a six-week internship program at Leapfrog University Inc. It uses plain vanilla JavaScript and HTML canvas for animations. Now all I have to do is find out the real identity of the programmer and thank him/her for being one of the Awesome People.

The original audio file can be played here.

Unfortunately, my pull request for added lyrics has not (yet) been accepted, so until I hear the good news the status of my Spider Solitaire song remains as “unofficial”. But if you really really really wanna sing along, here are the lyrics:

BTW, if you really like the walking bassline and have a friend whose voice can reach nearly two octaves below middle C, you can also add the repeated phrase “Spider Solitaire is awesome, Spider Solitaire is awesome” to your heart’s content.

Until next time, happy Spider Solitaire playing and may all your four-chord songs be awesome 😊

My Perilous Journey Into The GitHub Underworld

POP QUIZ: What could be more fun than me, two friends reading my blog and three blobs from the hit game Among Us participating in a multi-player game of Spider Solitaire?

Answer: Embarking on a perilous journey into the GitHub underworld.

Some time ago, I discovered that an “intelligent random move algorithm” can get a lousy win rate playing the Royal Game without rot13(haqb). By lousy, I mean somewhere in the vicinity of 6% – but lousy also means better than 0%. The basic gist of my algorithm is to repeatedly apply the following heuristics when choosing a sequence of moves:

  • basic tree search (i.e. examine N different random options and choose the best one)
  • a static evaluation function (rewarding in-suit builds, turnovers, moving suits to the foundations, etc.)
  • multiple-move look-ahead, stopping as soon as at least one new card is turned over
  • super-moves (a term borrowed from FreeCell) to traverse the search space more efficiently

Apparently, this is enough to get around a 6% win ratio on random deals. I haven’t thoroughly tested this on a large number of data points, so this should be taken with a pinch of NaCl. But why not enlist the help of other Spider Solitaire or programming enthusiasts to boost the win rate even further?
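For the programmers, one decision step of the algorithm looks roughly like the sketch below. The function and method names are illustrative only – the actual repository organises things differently:

    import random

    def choose_move_sequence(game, rng, n_options=20, max_depth=30):
        # Basic tree search: try N random candidate sequences, keep the best.
        best_score, best_sequence = None, None
        for _ in range(n_options):
            trial, sequence = game.copy(), []
            for _ in range(max_depth):
                move = rng.choice(trial.legal_supermoves())  # super-moves shrink the search space
                trial.apply(move)
                sequence.append(move)
                if trial.exposed_new_card():                 # stop the look-ahead at the first turnover
                    break
            score = trial.static_evaluation()   # rewards in-suit builds, turnovers, suits removed
            if best_score is None or score > best_score:
                best_score, best_sequence = score, sequence
        return best_sequence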

So, I’ve decided to take the plunge and sign up for a GitHub account. Today I created my account and uploaded my first project. My mad coding skillz aren’t nearly half as good as my mad Sudoku or Spider Solitaire skillz so hopefully I can keep my faux pas count to a minimum 😊 At least my actual job has provided me with some experience working with code repositories, so I’m not totally out on a limb.

The current state of my project is a bare-bones AI, with plenty of room for improvement – a deliberate choice on my part. For instance, I explicitly stated in the readme.md file that one possible “to-do” is constructing a test suite to sanity-check that the AI scores a high/low/zero win rate for an easy/hard/impossible hand. I could have put in the effort to make the best possible AI before releasing it into the wild, but that’s not exactly in the spirit of collaboration with external programmers.
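To give a flavour of what that test suite might look like, here is a rough sketch (the seed lists and thresholds are invented for illustration, and `play_game` stands for whatever entry point the AI ends up exposing):

    # Sketch of the sanity-check test suite described in readme.md.
    # The seeds and thresholds below are invented for illustration only.
    EASY_SEEDS = [101, 102, 103]
    IMPOSSIBLE_SEEDS = [201, 202, 203]

    def win_rate(game_seeds, monkey_seed=1):
        wins = sum(play_game(seed, monkey_seed).is_won() for seed in game_seeds)
        return wins / len(game_seeds)

    def test_easy_hands_usually_won():
        assert win_rate(EASY_SEEDS) > 0.5

    def test_impossible_hands_never_won():
        assert win_rate(IMPOSSIBLE_SEEDS) == 0.0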

I already know at least one work colleague with a GitHub account and a number of “fun colleagues” from the Cracking the Cryptic discord server. With Christmas holidays around the corner (not to mention some flex leave arrangements) I should be in a good position to devote some serious time to this project.

I believe a good way to get some traction happening on my Spider Solitaire project is to return the favour for other people on GitHub. If I can spend some time “building reputation” (if you will), then I might find out that rot13(Xnezn nva’g n ovgpu) after all.

I have managed to make my first pull request for someone else’s code – a minor bug fix for a Spider Solitaire server written in JavaScript. There really should be a cheevo for submitting a meaningful pull request for someone else within 24 hours of creating a GitHub account – even if it doesn’t get accepted. But I’m sure there are worse injustices in the world I could wax lyrical about.

Interesting times ahead, watch this space 😊