Feather

Machine learning in Houdini because why not?


I didn't see much implementation of machine learning in Houdini, so I wanted to give it a shot. Still just starting down this rabbit hole, but I figured I'd post my progress. Maybe someone else out there is working on this too.

First of all, I know most of this is super inefficient and there are faster ways to achieve the results, but that's not the point. The goal is to get as many machine learning basics functioning in Houdini as possible, without Python libraries just glossing over the math. I want to create visual explanations of how this stuff works. It helps me make sure I understand what's going on, and maybe it will help someone else who learns visually.

So... from the very bottom up, the first thing to understand is gradient descent, because that's the basic optimization routine underlying a neural network. Can we build it in SOPs without Python? Sure we can, and it's crazy slow.

On the left is just normal gradient descent. Once you start to iterate over more than 30 data points this starts to chug. So on the right is a stochastic gradient descent hybrid which, using small random batches, fits the line using over 500 data points. It's a little jittery because my step size is too big, but hey, it works, so... small victories.

[GIF: gradient descent (left) vs. mini-batch stochastic gradient descent (right) fitting a line]
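For anyone who wants the idea without the SOP network, here's a rough plain-Python sketch of the same mini-batch approach; the data, learning rate, and batch size are all made up for illustration. It fits y = m*x + b by averaging gradients over small random batches:

```python
# Minimal sketch of mini-batch stochastic gradient descent (no libraries):
# fit y = m*x + b to noisy synthetic data. All constants are illustrative.
import random

random.seed(0)

# Synthetic data scattered around the line y = 2x + 1
data = [(x, 2.0 * x + 1.0 + random.uniform(-0.5, 0.5))
        for x in [i / 500.0 for i in range(500)]]

m, b = 0.0, 0.0      # parameters of the fitted line
step = 0.05          # learning rate (too large -> jittery fit)
batch_size = 16

for _ in range(2000):
    batch = random.sample(data, batch_size)   # small random batch
    grad_m = grad_b = 0.0
    for x, y in batch:
        err = (m * x + b) - y                 # prediction error
        grad_m += 2.0 * err * x               # d(err^2)/dm
        grad_b += 2.0 * err                   # d(err^2)/db
    # step downhill along the averaged batch gradient
    m -= step * grad_m / batch_size
    b -= step * grad_b / batch_size

print(f"fit: m={m:.2f}, b={b:.2f}")  # should land near m=2, b=1
```

Shrinking `step` (or decaying it over time) reduces the jitter mentioned above, at the cost of slower convergence.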

 

Okay, so gradient descent works. Awesome, let's use it for some actual machine learning stuff, right?

The hello world of machine learning is image recognition of handwritten digits using the MNIST dataset. MNIST is a collection of 60,000 28x28-pixel images of handwritten digits. Each one is labeled with the digit it's supposed to be, so we can use it to train a network.

The data is stored as binary files, so I had to use a bit of Python to interpret them, but here it is.

 

[GIF: MNIST digit images loaded into Houdini]
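For reference, the MNIST files use the IDX binary layout: a big-endian magic number, the dimension sizes, then the raw bytes. Here's a rough sketch of what that "bit of Python" step can look like; the file name in the usage comment is the standard MNIST one and is assumed:

```python
# Sketch of an IDX parser for the MNIST binary files.
import struct

def parse_idx(buf):
    """Parse an IDX byte buffer into (dims, flat list of byte values)."""
    # Magic number: two zero bytes, a dtype code (0x08 = unsigned byte),
    # and the number of dimensions.
    zeros, dtype, ndims = struct.unpack_from(">HBB", buf, 0)
    assert zeros == 0 and dtype == 0x08, "expected unsigned-byte IDX data"
    # Each dimension size is a big-endian 32-bit unsigned int.
    dims = struct.unpack_from(">" + "I" * ndims, buf, 4)
    offset = 4 + 4 * ndims
    return dims, list(buf[offset:])

# Usage (assuming the files have been downloaded and un-gzipped):
# with open("train-images-idx3-ubyte", "rb") as f:
#     dims, pixels = parse_idx(f.read())
# dims -> (60000, 28, 28); pixels[:784] is the first image, row-major.
```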

 

Now that I can access the data, the next step is actually getting this thing to a trainable state. I'm still figuring this stuff out as I go, so I'll probably post updates over the holiday weekend.

In the meantime, anyone else out there playing with this stuff?


Here are also some links.
All coded in Processing (Java); easy to translate to VEX.

A good basic explanation by Daniel Shiffman (the Processing guru):
https://natureofcode.com/book/chapter-10-neural-networks/

Code available here:
https://github.com/nature-of-code/noc-examples-processing/tree/master/chp10_nn

An excellent tutorial by Charles Fried (three parts):
https://medium.com/typeme/lets-code-a-neural-network-from-scratch-part-1-24f0a30d7d62
[Image: neural network diagram from the tutorial]

Code available here:
https://github.com/CharlesFr/ANN_Tutorial

And a link to download an ASCII CSV version of the MNIST database:
https://pjreddie.com/projects/mnist-in-csv/

@Feather: A quick question: how do you manage the OOP aspect of this code?
Structs?
Translating it to functional programming?
Some other trick?

Edited by flcc


Glad I'm not alone in enjoying this stuff. Thanks for the videos, guys!

 

This update took a lot longer than I thought, so I wanted to give a slight preview. I had to go back and learn some of the maths behind this stuff to really break down what a lot of the Python scripts were doing, so I can rebuild this network in a visual way.

In case anyone wants to understand the math going on behind these networks, a really good resource is the YouTube channel 3Blue1Brown. He has an entire series on calculus and another short series on neural networks. If you're missing the foundations in linear algebra, you can watch another series by a YouTuber named George Soilis.

 

At first I thought I could get away with something similar to the videos I had been watching, which used aggregate sums to define the value of each neuron. Unfortunately that doesn't give quite the intuitive result I was looking for, so... introducing neural net 2.0 below.

It's not 100% done, but once it's finished you'll be able to watch literally every single vector change as each neuron learns.

[Images: preview of neural net 2.0 in the Houdini viewport]
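For anyone following along, here is a rough sketch of the same idea in plain Python (not the network from the screenshots; the dataset, layer sizes, and learning rate are made up): a single hidden layer of sigmoid neurons, each with its own weights and bias, trained by backpropagation on a tiny OR-gate dataset.

```python
# Toy neural network: 2 inputs -> 2 hidden sigmoid neurons -> 1 output,
# trained with plain gradient descent / backpropagation. Illustrative only.
import math
import random

random.seed(1)

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

w1 = [[random.uniform(-1, 1) for _ in range(2)] for _ in range(2)]  # hidden weights
b1 = [0.0, 0.0]                                                     # hidden biases
w2 = [random.uniform(-1, 1) for _ in range(2)]                      # output weights
b2 = 0.0                                                            # output bias

data = [([0, 0], 0), ([0, 1], 1), ([1, 0], 1), ([1, 1], 1)]  # OR gate
lr = 0.5

def loss():
    total = 0.0
    for x, y in data:
        h = [sigmoid(w1[j][0]*x[0] + w1[j][1]*x[1] + b1[j]) for j in range(2)]
        out = sigmoid(w2[0]*h[0] + w2[1]*h[1] + b2)
        total += (out - y) ** 2
    return total

start = loss()
for _ in range(2000):
    for x, y in data:
        # Forward pass: each neuron is a weighted sum plus its own bias,
        # squashed through a sigmoid.
        h = [sigmoid(w1[j][0]*x[0] + w1[j][1]*x[1] + b1[j]) for j in range(2)]
        out = sigmoid(w2[0]*h[0] + w2[1]*h[1] + b2)
        # Backward pass: chain rule through each neuron's sigmoid.
        d_out = 2 * (out - y) * out * (1 - out)
        for j in range(2):
            d_h = d_out * w2[j] * h[j] * (1 - h[j])
            w2[j] -= lr * d_out * h[j]
            for i in range(2):
                w1[j][i] -= lr * d_h * x[i]
            b1[j] -= lr * d_h
        b2 -= lr * d_out

print(f"loss: {start:.3f} -> {loss():.3f}")
```

Every weight and bias update above corresponds to one of the vectors you'd watch change in the viewport version.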

Edited by Feather

This is an interesting development, Vlad - would love to see where you take it!


As I'm going through the maths: because each of the inputs is actually a neuron with its own bias and weight to consider, the following image is a better representation of what's actually happening.

The boxes above are part of the mini-batch process and are difficult to show at this scale.

[Image: revised network diagram with per-input neurons]
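The per-neuron computation described above can be sketched in a few lines of plain Python (the input and weight values here are made up): each neuron takes the previous layer's outputs, applies its own weights and bias, and squashes the sum.

```python
# One neuron: weighted sum of inputs plus this neuron's own bias,
# passed through a sigmoid. Values are illustrative.
import math

def neuron(inputs, weights, bias):
    z = sum(w * x for w, x in zip(weights, inputs)) + bias
    return 1.0 / (1.0 + math.exp(-z))  # sigmoid activation

print(neuron([0.5, 0.2], [0.8, -0.4], 0.1))
```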


@Librarian The real-time thing is awesome; that guy also did some work in Houdini with evolution networks and posted it to his Vimeo.

A convolutional network will certainly follow once I have this working properly. :)

