mark my words

atheist father praying for a secular future

Funny AI Book Recommendation

On Goodreads:

Because I liked a book about Bill Clinton published in the 90s, I might like an advanced poker book?!  Funny thing is I’m pretty sure I’ve read that book too, so it is a good suggestion.

June 9, 2017 Uncategorized

LSTM Cells

I haven’t quite got to LSTMs in my book yet, but they were talking about them at the TensorFlow Dev Summit.
I found this neato resource

LSTMs really remind me of logic gates, especially the flip-flop, where a couple of NOR gates are wired together to make memory.

A series of LSTMs might be good for parsing the entire bidding auction.  The bidding is kind of like a sentence.
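A single LSTM step can be sketched in plain NumPy to show the gate structure (the logic-gate flavour is easy to see).  The sizes, random weights, and three-symbol "bidding vocabulary" below are made-up placeholders, just to run a toy auction through the cell:

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def lstm_step(x, h_prev, c_prev, W, U, b):
    """One LSTM time step.  W, U, b hold the four gates stacked:
    forget, input, candidate, output -- a bit like wiring up logic gates."""
    z = W @ x + U @ h_prev + b
    n = h_prev.shape[0]
    f = sigmoid(z[0*n:1*n])          # forget gate: keep the old memory?
    i = sigmoid(z[1*n:2*n])          # input gate: write new memory?
    g = np.tanh(z[2*n:3*n])          # candidate values to write
    o = sigmoid(z[3*n:4*n])          # output gate: expose the memory?
    c = f * c_prev + i * g           # cell state, the flip-flop-like part
    h = o * np.tanh(c)               # hidden output
    return h, c

rng = np.random.default_rng(0)
n_hidden, n_in = 4, 3                # tiny toy sizes (placeholders)
W = rng.normal(size=(4 * n_hidden, n_in))
U = rng.normal(size=(4 * n_hidden, n_hidden))
b = np.zeros(4 * n_hidden)

h = np.zeros(n_hidden)
c = np.zeros(n_hidden)
# Feed a toy "bidding sentence": each bid is a one-hot token.
for bid in np.eye(n_in):
    h, c = lstm_step(bid, h, c, W, U, b)
print(h.shape)
```

Chaining the steps like this over the bids is exactly the sense in which the auction reads like a sentence.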

February 17, 2017 AI Bridge Project

TensorFlow Summit

This tool is pretty neat.  It lets you visualize your data.

Data Map

It would be neat if I could map different bridge hands into 3D space for display
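One way to get such a 3D map would be plain PCA on the 52-element hand vectors.  This is just a sketch with randomly dealt placeholder hands, not real deal data:

```python
import numpy as np

rng = np.random.default_rng(1)

def random_hand_vector():
    """A 52-element 0/1 vector marking which 13 cards a hand holds."""
    v = np.zeros(52)
    v[rng.choice(52, size=13, replace=False)] = 1.0
    return v

hands = np.stack([random_hand_vector() for _ in range(200)])

# Classic PCA: center the data, then project onto the top 3 directions.
centered = hands - hands.mean(axis=0)
_, _, vt = np.linalg.svd(centered, full_matrices=False)
coords3d = centered @ vt[:3].T        # each hand becomes a point in 3D
print(coords3d.shape)
```

Those 3D coordinates could then be fed straight into the Embedding Projector-style display.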

February 16, 2017 AI Bridge Project

Neat Images

I thought this article had some neat images!



AI Bridge Project

TensorFlow 1.0

Looks like there is a new release of TensorFlow.  I see there is an 8-hour video!  Looks like good watching, perhaps this weekend?

AI Bridge Project

Well I guess someone beat me to the punch

I found an amazing journal entry explaining exactly what I wanted to do!  They even did a better job than what I had in mind.

They got around the ‘double dummy’ problem by dealing 5 hands, a pretty good idea.  They also used gradient descent based on the score.  I wonder if they are calculating double dummy throughout their learning loop.

A couple of things could be added: they don’t seem to take the opponents’ bidding or vulnerability into consideration.  They also didn’t partition their DNN based on suits, which I think would give the model a big head start.

But really cool work.


AI Bridge Project

Deep Learning

This deep learning book is turning out to be very technical, but it is terrific!  It explains in technical detail exactly what each function does and the reasoning behind it.

My purpose in learning deep learning is to develop a bridge bidding system using raw data.  Essentially, don’t worry about human conventions at all.  Also, don’t tell the computer it has a partner; let it just find the best score based on the information it has.

This insight has allowed me to think of other problems.  E.g., when solving double dummy hands, the maximum number of tricks you can make might be different from the number of tricks you need.  Thankfully the deal program can tell you how many tricks you can take via a greedy line, and how many via a conservative line.

I think the idea of bidding conventions might be a bit complex.  It might be best to have a computer actually bid what it thinks it can make.  I obviously can’t create machine learning for each bidding sequence, so I think the computer should only take the last bid into consideration.  What happens if the opponents are playing a conventional bid?  Then it would be stupid to try to infer anything other than that it could be the last bid they make!  I imagine it will emerge that the computer agents have a huge advantage in bidding what they actually have, both for descriptive reasons and for pre-emptive reasons.

A shortcut I could use is, once I have a deal, to do learning with each hand (e.g. 1 deal = 4 learning iterations).  But this might really bias the data, so I think I won’t do that.

The book also covers softmax functions, which I think I can use to assign a probability of making each contract, multiplied by the ‘expected value’: the scored reward of actually making the contract.
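That softmax-times-reward idea can be written down directly.  The logits and contract scores below are made-up numbers purely for illustration:

```python
import numpy as np

def softmax(z):
    """Numerically stable softmax: subtract the max before exponentiating."""
    e = np.exp(z - z.max())
    return e / e.sum()

# Hypothetical network outputs (logits), one per candidate contract.
logits = np.array([1.2, 0.3, 2.0, -0.5])      # e.g. pass, 1D, 2S, 3NT (made up)
scores = np.array([0.0, 70.0, 110.0, 400.0])  # reward if that contract makes

probs = softmax(logits)                        # probability of each contract
expected_value = float(probs @ scores)         # probability-weighted score
print(expected_value)
```

The bid with the highest probability-weighted score would then be the one the agent actually makes.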

The model that seems pretty universal is a DNN with linear regression on the inputs, a convolutional network on each suit, and a couple of relu layers, followed by a softmax expected value.  It would be neat to use gradient descent on the calculated score of the different bids, but I’ll likely just use the default L2 loss.
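A simplified forward pass of that architecture can be sketched in NumPy.  This version keeps only the per-suit convolution, the relu layers, and the softmax head, with random placeholder weights and made-up layer sizes, so it is a shape-check of the idea rather than the trained model:

```python
import numpy as np

rng = np.random.default_rng(2)

def relu(x):
    return np.maximum(x, 0.0)

def softmax(z):
    e = np.exp(z - z.max())
    return e / e.sum()

def forward(hand52, params):
    """Sketch: a shared width-3 convolution over each 13-card suit,
    two relu layers, then a softmax over candidate contracts."""
    suits = hand52.reshape(4, 13)                  # clubs..spades, 13 cards each
    k = params["kernel"]
    conv = np.stack([np.convolve(s, k, mode="valid") for s in suits])
    x = relu(conv.ravel())                         # 4 suits x 11 positions = 44
    x = relu(params["W1"] @ x + params["b1"])
    x = relu(params["W2"] @ x + params["b2"])
    return softmax(params["W3"] @ x + params["b3"])

n_contracts = 36                                   # 35 contracts + pass
params = {
    "kernel": rng.normal(size=3),
    "W1": rng.normal(size=(32, 44)), "b1": np.zeros(32),
    "W2": rng.normal(size=(16, 32)), "b2": np.zeros(16),
    "W3": rng.normal(size=(n_contracts, 16)), "b3": np.zeros(n_contracts),
}

hand = np.zeros(52)
hand[rng.choice(52, 13, replace=False)] = 1.0      # a random placeholder hand
probs = forward(hand, params)
print(probs.shape)
```

Partitioning the convolution by suit like this is the head start mentioned earlier: the model never has to learn that card 14 starts a new suit.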

So that is my best beginner approach.

February 14, 2017 AI Bridge Project

Next on the list

A neat primer!



February 3, 2017 AI Bridge Project

First bidding Tensorflow



So, I gave my neural network 2 hands: for the first one it says I should pass, and for the second I should bid 18.

What is all that red stuff?  I guess the syntax in the example is deprecated.  I’ll have to see if I can fix that.

February 1, 2017 AI Bridge Project


So it looks like TensorFlow has a high-level API, tf.contrib.learn, which should be easier than the example I looked at yesterday.

It has a training dataset using IRIS (the flower), whereby the data is imported via a CSV file.  So I’ve formatted a few CSV files, and I’ll see if I can get my first run through.

I’ve formatted them the same as the IRIS data:


120,4,setosa,versicolor,virginica
6.4,2.8,5.6,2.2,2
5,2.3,3.3,1,1
4.9,2.5,4.5,1.7,2
4.9,3.1,1.5,0.1,0
5.7,3.8,1.7,0.3,0
4.4,3.2,1.3,0.2,0
5.4,3.4,1.5,0.4,0
6.9,3.1,5.1,2.3,2

It looks like the 120 is there because it’s got 120 records, the 4 because there are 4 data points per record, and then three label names because the output is three different types of flower.
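The same header convention could be reproduced for the bridge data.  The row count, feature count, and label name below are illustrative placeholders, with all-zero dummy feature vectors standing in for real hands:

```python
import csv
import io

# Each row: 52 binary card features, then the index of the best contract.
rows = [
    ([0] * 52, 8),    # made-up hand vector, best contract index 8
    ([0] * 52, 23),   # made-up hand vector, best contract index 23
]

buf = io.StringIO()
w = csv.writer(buf)
# IRIS-style header: n_samples, n_features, then the label column name.
w.writerow([len(rows), 52, "contract"])
for features, label in rows:
    w.writerow(features + [label])

header = buf.getvalue().splitlines()[0]
print(header)
```

That should give tf.contrib.learn the same shape of file it expects from the IRIS example.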

My data looks like this:







The 8 and the 23 were the highest-scoring contracts given that hand for North (1 diamond and 2 spades).
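A hand can be turned into that 52-element vector with a small helper.  The card ordering here (clubs through spades, deuce through ace) is my assumption; the actual files may order the cards differently:

```python
SUITS = "CDHS"            # assumed suit order: clubs, diamonds, hearts, spades
RANKS = "23456789TJQKA"   # assumed rank order within each suit

def card_index(card):
    """Map a two-character card like 'SA' (ace of spades)
    to its position in the 52-element vector."""
    suit, rank = card[0], card[1]
    return SUITS.index(suit) * 13 + RANKS.index(rank)

def encode_hand(cards):
    """Return a 52-element 0/1 list marking which cards the hand holds."""
    v = [0] * 52
    for card in cards:
        v[card_index(card)] = 1
    return v

vec = encode_hand(["SA", "SK", "HQ", "D2", "C7"])
print(sum(vec))
```

With 13 cards per hand, each encoded row should always sum to 13, which is a handy sanity check on the CSV data.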

Then I’ll call the sample Python script and alter the query data to see what it thinks I should make for a contract.

import numpy as np

# Each row is a 52-element 0/1 vector encoding which cards the hand holds.
new_samples = np.array(
    [[0,0,0,0,0,1,1,0,1,0,0,0,0,0,0,1,0,1,0,0,0,1,1,0,0,0,0,0,0,1,1,0,1,0,0,0,0,0,0,0,0,1,0,0,0,0,1,1,0,0,0,0],
     [0,1,0,1,1,0,0,0,0,0,0,1,1,0,0,1,0,0,0,0,0,0,0,1,1,0,0,0,0,0,1,0,1,0,0,1,0,0,0,0,0,0,0,0,0,0,1,0,0,0,1,0]],
    dtype=float)

AI Bridge Project