- Graph Databases
- Graph Signal Processing
- Matrix / GNU Octave Basics
- "Memory" in Neural Networks
- Performance Measures (Metrics)
To fully understand the basics of machine learning, you need to study at least a bit of linear algebra (matrix mathematics: transformations, multiplications, …) and partial derivatives (needed for the chain rule, which drives backpropagation). If you are at all serious about machine learning, Andrew Ng’s Machine Learning course is a must, if only to understand the basics.
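To make the chain-rule connection concrete, here is a minimal sketch (my own toy example, not from the course): a single sigmoid neuron with squared-error loss, where the gradient with respect to the weight is computed by multiplying the three local derivatives, then checked against a numerical finite-difference estimate.

```python
import math

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def loss(w, b, x, t):
    # forward pass: one neuron, squared-error loss against target t
    y = sigmoid(w * x + b)
    return (y - t) ** 2

def grad_w(w, b, x, t):
    # chain rule: dL/dw = dL/dy * dy/dz * dz/dw
    z = w * x + b
    y = sigmoid(z)
    dL_dy = 2 * (y - t)
    dy_dz = y * (1 - y)   # derivative of the sigmoid
    dz_dw = x
    return dL_dy * dy_dz * dz_dw

# sanity check: analytic gradient vs. central finite difference
w, b, x, t = 0.5, -0.2, 1.5, 1.0
analytic = grad_w(w, b, x, t)
eps = 1e-6
numeric = (loss(w + eps, b, x, t) - loss(w - eps, b, x, t)) / (2 * eps)
print(abs(analytic - numeric) < 1e-8)
```

Backpropagation in a full network is just this same multiplication of local derivatives, applied layer by layer from the loss backwards.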
Here are my old notes from that course (2013; some of the images are missing, but those omissions are trivial, as the notes are otherwise very comprehensive):
As well, here is Chris Olah’s “Backpropagation” blog post, with my handwritten partial-differentiation calculations.