# 2021-12-21

**Day 3 of fastai**. The book arrived today and I'm now following along from
it directly. Reading the paper book alongside the Jupyter notebooks in the
GitHub book repo is a potent combination.
Reminder: you can run the book repo in a single step using
ez and VS Code.

The Universal Approximation Theorem provides the theoretical basis for neural networks' ability to approximate arbitrary continuous functions. Its two main variants examine the limits of a single hidden layer with an arbitrary number of neurons ("arbitrary width") and of a network with an arbitrary number of hidden layers ("arbitrary depth").
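A minimal sketch of the "arbitrary width" idea, using NumPy rather than fastai for brevity: fix random weights and biases for a single wide ReLU hidden layer, then fit only the output weights by least squares to approximate sin(x). All the names here (width, target function, seed) are arbitrary choices for illustration, not anything from the book.

```python
import numpy as np

rng = np.random.default_rng(0)

# Sample points from the target function we want to approximate.
x = np.linspace(-np.pi, np.pi, 200)[:, None]  # inputs, shape (200, 1)
y = np.sin(x).ravel()                         # targets, shape (200,)

# One hidden layer with many ReLU neurons ("arbitrary width").
width = 256
W = rng.normal(size=(1, width))               # random hidden weights
b = rng.uniform(-np.pi, np.pi, size=width)    # random hidden biases
H = np.maximum(0.0, x @ W + b)                # ReLU hidden activations

# Fit only the output layer: minimize ||H @ c - y||^2.
c, *_ = np.linalg.lstsq(H, y, rcond=None)
y_hat = H @ c

max_err = np.max(np.abs(y_hat - y))
print(f"max approximation error on the sample points: {max_err:.4f}")
```

This is a random-features toy, not how a real network is trained (gradient descent tunes the hidden layer too), but it shows the core claim: with enough hidden units, a single layer's piecewise-linear combinations can hug a smooth curve closely.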