A minimal deep learning library.
Modern deep learning frameworks are too big. PyTorch and TensorFlow are hundreds of thousands of lines of code; their immense complexity makes adding new features like hardware acceleration difficult.
Nanograd is simple. Less code. Fewer operations. Built with Rust 🦀.
Nanograd represents a network as a dynamic DAG; gradients are computed by backpropagation. However, the DAG operates only over scalar values, so every neuron is decomposed into many individual add and multiply operations: simple, but slow.
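To make the scalar DAG concrete, here is a minimal, self-contained sketch in Rust of how such a graph can be built and backpropagated. The `Value`/`Node` types and their methods are illustrative assumptions, not nanograd's actual API.

```rust
use std::cell::RefCell;
use std::rc::Rc;

// One node in the scalar graph: a value, its gradient, and the parents it
// was computed from, each paired with the local derivative w.r.t. that parent.
struct Node {
    data: f64,
    grad: f64,
    parents: Vec<(Value, f64)>,
}

#[derive(Clone)]
struct Value(Rc<RefCell<Node>>);

impl Value {
    fn new(data: f64) -> Self {
        Value(Rc::new(RefCell::new(Node { data, grad: 0.0, parents: Vec::new() })))
    }

    fn data(&self) -> f64 {
        self.0.borrow().data
    }

    // z = a + b, with dz/da = 1 and dz/db = 1.
    fn add(&self, other: &Value) -> Value {
        let out = Value::new(self.data() + other.data());
        out.0.borrow_mut().parents = vec![(self.clone(), 1.0), (other.clone(), 1.0)];
        out
    }

    // z = a * b, with dz/da = b and dz/db = a.
    fn mul(&self, other: &Value) -> Value {
        let out = Value::new(self.data() * other.data());
        out.0.borrow_mut().parents =
            vec![(self.clone(), other.data()), (other.clone(), self.data())];
        out
    }

    // Backpropagation: seed the output gradient and push it to the parents
    // via the chain rule. (A real implementation topologically sorts the DAG
    // first so shared nodes are handled correctly.)
    fn backward(&self) {
        self.0.borrow_mut().grad = 1.0;
        self.propagate();
    }

    fn propagate(&self) {
        let grad = self.0.borrow().grad;
        let parents = self.0.borrow().parents.clone();
        for (parent, local) in parents {
            parent.0.borrow_mut().grad += local * grad;
            parent.propagate();
        }
    }
}

fn main() {
    // A single neuron y = w1*x1 + w2*x2 + b, decomposed into scalar ops.
    let (x1, x2) = (Value::new(2.0), Value::new(-1.0));
    let (w1, w2, b) = (Value::new(0.5), Value::new(1.5), Value::new(1.0));
    let y = w1.mul(&x1).add(&w2.mul(&x2)).add(&b);
    y.backward();
    println!("y = {}, dy/dw1 = {}", y.data(), w1.0.borrow().grad);
}
```

Even this tiny neuron already produces five graph nodes, which is why a scalar-only DAG gets slow as networks grow.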
The next version of nanograd will replace scalars with tensors.
The playground lets anyone experiment with neural networks in the browser. Change hyperparameters, select a dataset, and add layers, all from a slick interface.
Training requests are fulfilled by nanograd. Since nanograd is written in Rust, the code is first compiled to a WebAssembly (WASM) module.
The chosen model settings are passed to a web worker, where the nanograd WASM module runs on a separate thread. Results are sent back to the main thread and displayed.
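As a rough illustration of this flow (the function name and config format below are assumptions, not nanograd's real exports), a training entry point could be exposed to JavaScript with wasm-bindgen and built with a tool such as wasm-pack (`wasm-pack build --target web`). The playground's web worker would then import the generated module, call the function, and `postMessage` the result back to the main thread.

```rust
use wasm_bindgen::prelude::*;

/// Hypothetical training entry point called from the playground's web worker.
/// `config_json` would carry the hyperparameters chosen in the UI; the
/// returned JSON (e.g. a loss curve) is posted back to the main thread.
#[wasm_bindgen]
pub fn train(config_json: &str) -> String {
    // In a real build: parse the config, construct the network, and run
    // backpropagation. Here we only return a placeholder result.
    format!("{{\"config_bytes\": {}, \"losses\": []}}", config_json.len())
}
```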