Feather Posted December 20, 2019

I didn't see much implementation of machine learning in Houdini, so I wanted to give it a shot. I'm still just starting down this rabbit hole, but I figured I'd post the progress. Maybe someone else out there is working on this too.

First of all, I know most of this is super inefficient and there are faster ways to achieve the results, but that's not the point. The goal is to get as many machine learning basics functioning in Houdini as possible, without Python libraries just glossing over the math. I want to create visual explanations of how this stuff works. It helps me make sure I understand what's going on, and maybe it will help someone else who learns visually.

So, from the very bottom up, the first thing to understand is gradient descent, because that's the basic underlying mechanism of a neural network. Can we create that in SOPs without Python? Sure we can, and it's crazy slow. On the left is plain gradient descent. Once you iterate over more than 30 data points this starts to chug. So on the right is a stochastic gradient descent hybrid which, using small random batches, fits the line using over 500 data points. It's a little jittery because my step size is too big, but hey, it works. Small victories.

Okay, so gradient descent works. Awesome, let's use it for some actual machine learning. The "hello world" of machine learning is image recognition of handwritten digits using the MNIST dataset. MNIST is a collection of 60 thousand 28-by-28-pixel images of handwritten digits. Each one is labeled with the digit it's supposed to be, so we can use it to train a network. The data is stored as a binary file, so I had to use a bit of Python to interpret the files, but here it is.

Now that I can access the data, the next step is actually getting this thing to a trainable state. I'm still figuring this stuff out as I go, so I'll probably post updates over the holiday weekend. In the meantime...
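As a reference for the math in the post above, here is a minimal sketch of the mini-batch line-fitting update in plain Python rather than VEX. All names and hyperparameter values are illustrative, not the ones from the actual SOP setup:

```python
import random

def sgd_fit_line(points, lr=0.05, epochs=2000, batch_size=8):
    """Fit y = m*x + b by mini-batch stochastic gradient descent
    on squared error. points is a list of (x, y) tuples."""
    m, b = 0.0, 0.0
    for _ in range(epochs):
        # a small random batch, like the stochastic hybrid described above
        batch = random.sample(points, min(batch_size, len(points)))
        grad_m = grad_b = 0.0
        for x, y in batch:
            err = (m * x + b) - y      # prediction error
            grad_m += 2 * err * x      # d(err^2)/dm
            grad_b += 2 * err          # d(err^2)/db
        n = len(batch)
        m -= lr * grad_m / n           # step downhill along the gradient
        b -= lr * grad_b / n
    return m, b
```

Shrinking `lr` (the step size) reduces the jitter mentioned above, at the cost of slower convergence.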
Anyone else out there playing with this stuff?
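For anyone following along, the binary format mentioned in the post above is the IDX format MNIST ships in: a big-endian header (a magic number, then the dimension sizes) followed by raw unsigned bytes. A minimal pure-Python reader might look like this (function names are mine, not necessarily what the post used):

```python
import struct

def read_idx_images(path):
    """Parse an MNIST IDX image file: 16-byte big-endian header
    (magic 2051, image count, rows, cols), then raw pixel bytes."""
    with open(path, "rb") as f:
        magic, count, rows, cols = struct.unpack(">IIII", f.read(16))
        assert magic == 2051, "not an IDX image file"
        data = f.read(count * rows * cols)
    # one flat bytes object of pixel values (0-255) per image
    return [data[i * rows * cols:(i + 1) * rows * cols] for i in range(count)]

def read_idx_labels(path):
    """Parse an MNIST IDX label file: 8-byte header (magic 2049, count),
    then one byte per label."""
    with open(path, "rb") as f:
        magic, count = struct.unpack(">II", f.read(8))
        assert magic == 2049, "not an IDX label file"
        return list(f.read(count))
```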
Librarian Posted December 21, 2019

I've been having fun with tutorials and files from the links below. Just having fun.

https://github.com/CppCon/CppCon2018/blob/master/Presentations/better_cpp_using_machine_learning_on_large_projects/better_cpp_using_machine_learning_on_large_projects__nicolas_fleury_mathieu_nayrolles__cppcon_2018.pdf
https://pytorch.org/tutorials/intermediate/spatial_transformer_tutorial.html
https://www.sidefx.com/forum/topic/60175/?page=1#post-269123
https://github.com/pedohorse
flcc Posted December 21, 2019

Here are some more links, all coded in Processing (Java) and easy to translate to VEX.

A good basic explanation by Daniel Shiffman (Processing guru): https://natureofcode.com/book/chapter-10-neural-networks/ with the code available here: https://github.com/nature-of-code/noc-examples-processing/tree/master/chp10_nn

An excellent tutorial by Charles Fried (three parts): https://medium.com/typeme/lets-code-a-neural-network-from-scratch-part-1-24f0a30d7d62 with the code available here: https://github.com/CharlesFr/ANN_Tutorial

And a link to download an ASCII CSV version of the MNIST database: https://pjreddie.com/projects/mnist-in-csv/

@Feather: A quick question: how do you manage the OOP aspects of this code? Structs? Translating it to functional programming? Some other trick?
Feather Posted January 11, 2020

Glad I'm not alone in enjoying this stuff. Thanks for the videos, guys! This update took a lot longer than I thought, so I wanted to give a slight preview. I had to go back and learn some of the maths behind this stuff to really break down what a lot of the Python scripts were doing, so I can rebuild this network in a visual way.

In case anyone wants to understand the math going on behind these networks, a really good resource is the YouTube channel 3Blue1Brown. He has an entire series on calculus and another short series on neural networks. If you're missing the foundation in linear algebra, you can watch another series by a YouTuber named George Soilis.

At first I thought I could get away with something similar to the videos I had been watching, which used aggregate sums to define the value of each neuron. Unfortunately that doesn't give quite the intuitive result I was looking for, so... introducing neural net 2.0 below. It's not 100% done, but once it's finished you'll be able to watch literally every single vector change as each neuron learns.
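Since the goal is watching each neuron learn, the math for a single sigmoid neuron can be sketched like this (a pure-Python illustration with made-up sample data and hyperparameters; the Houdini version obviously does this per point in SOPs/VEX). Recording the weight and bias every epoch gives exactly the kind of trajectory you could visualize:

```python
import math

def train_neuron(samples, lr=0.5, epochs=2000):
    """Train one sigmoid neuron by gradient descent on squared error.
    samples is a list of (input, target) pairs. Returns the final
    weight and bias plus their per-epoch trajectory for visualization."""
    w, b = 0.0, 0.0
    history = []
    for _ in range(epochs):
        for x, target in samples:
            a = 1.0 / (1.0 + math.exp(-(w * x + b)))   # activation
            # chain rule: dE/dz = (a - target) * a * (1 - a)
            # for E = 0.5 * (a - target)^2 and sigmoid activation
            delta = (a - target) * a * (1.0 - a)
            w -= lr * delta * x
            b -= lr * delta
        history.append((w, b))   # one sample of the learning trajectory
    return w, b, history
```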
Jason Posted January 12, 2020

This is an interesting development, Vlad. Would love to see where you take it!
Feather Posted January 12, 2020

As I'm going through the maths: because each of the inputs is actually a neuron with its own bias and weight to consider, the following image is a better representation of what's actually happening. The boxes above are part of the mini-batch process and difficult to show at this scale.
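To spell that picture out: every neuron holds its own bias plus one weight per incoming connection, so a dense layer reduces to the sketch below (pure Python with made-up weights, nothing from the actual scene file):

```python
import math

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def layer_forward(inputs, weights, biases):
    """One dense layer: each neuron computes a weighted sum of
    all inputs plus its own bias, squashed through a sigmoid."""
    return [sigmoid(sum(w * x for w, x in zip(neuron_w, inputs)) + b)
            for neuron_w, b in zip(weights, biases)]

# a tiny 2-3-1 network: 2 inputs, 3 hidden neurons, 1 output neuron
hidden = layer_forward([0.5, -0.2],
                       [[0.1, 0.4], [-0.3, 0.8], [0.5, 0.5]],
                       [0.0, 0.1, -0.1])
output = layer_forward(hidden, [[0.2, -0.6, 0.9]], [0.05])
```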
m.shahmardani Posted January 12, 2020

This is interesting. This way, I think you could also build a DNN using convolution and pooling layers.
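For reference, the convolution and pooling layers suggested here come down to very little code. A minimal sketch, assuming "valid" padding and noting that deep-learning "convolution" is usually cross-correlation:

```python
def conv2d(image, kernel):
    """Valid 2D cross-correlation of a 2D list by a 2D kernel."""
    kh, kw = len(kernel), len(kernel[0])
    h, w = len(image), len(image[0])
    return [[sum(image[i + di][j + dj] * kernel[di][dj]
                 for di in range(kh) for dj in range(kw))
             for j in range(w - kw + 1)]
            for i in range(h - kh + 1)]

def max_pool(fmap, size=2):
    """Non-overlapping max pooling over size-by-size windows."""
    return [[max(fmap[i + di][j + dj]
                 for di in range(size) for dj in range(size))
             for j in range(0, len(fmap[0]) - size + 1, size)]
            for i in range(0, len(fmap) - size + 1, size)]
```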
Librarian Posted January 12, 2020

Aha, I see now.
Feather Posted January 12, 2020

@Librarian The real-time thing is awesome; that guy also did some work in Houdini with evolution networks and posted it to his Vimeo. A convolution network is certainly to follow once I have this working properly.
Rubs Posted October 8, 2021

I just bumped into this thread today... great, it turns out I am not that crazy. https://www.rubdhz.xyz/exploring-neural-networks-in-houdini

Really glad to see that there are more people building machine learning toys in Houdini!