What are Tensor Diagrams?
Machine learning involves a lot of tensor manipulation, and it's easy to lose track of the bigger picture when manipulating high-dimensional data using notation designed for vectors and matrices.
It turns out all the trouble with tensors disappears when you instead represent them using graphs:
This book aims to standardize the notation for tensor diagrams by rewriting the classical "Matrix Cookbook" using this notation.
Tensor diagrams are better than alternative notations such as index notation (einsum) because they:
- Make it easy to spot patterns and symmetries
- Avoid all trouble with vectorization and Kronecker products
- Make Matrix Calculus simple and intuitive
- Represent functions and broadcasting effortlessly
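To make the contrast concrete, here is a minimal sketch of the index-notation (einsum) style that the points above compare against, written with NumPy. The shapes and names are illustrative, not from the book: every contraction requires spelling out index strings by hand, and the gradient formula must be derived by tracking indices manually.

```python
import numpy as np

# Illustrative shapes for a batched linear map (assumed, not from the book)
b, x, y = 4, 3, 2
rng = np.random.default_rng(0)
X = rng.normal(size=(b, x))
W = rng.normal(size=(x, y))
Y = rng.normal(size=(b, y))

# error[b, y] = sum_x X[b, x] W[x, y] - Y[b, y]
error = np.einsum("bx,xy->by", X, W) - Y

# Squared loss: contract both remaining indices
loss = np.einsum("by,by->", error, error)

# Gradient wrt W, derived by hand in index notation:
# dloss/dW[x, y] = 2 * sum_b X[b, x] * error[b, y]
grad_W = 2 * np.einsum("bx,by->xy", X, error)
```

Every step forces you to name and match indices explicitly; a tensor diagram expresses the same contractions as edges between nodes, so the structure is visible at a glance.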
Tensorgrad
Tensorgrad is a Python library for symbolic tensor manipulation and derivatives using tensor diagrams. Try it here:
```python
import sympy as sp
import tensorgrad as tg

# Symbolic dimensions for a batched least-squares problem
b, x, y = sp.symbols("b x y")
X = tg.Variable("X", b, x)
Y = tg.Variable("Y", b, y)
W = tg.Variable("W", x, y)

error = X @ W - Y
loss = error @ error

# Symbolic derivative of the loss with respect to W
expr = tg.Derivative(loss, W)
save_steps(expr)
```