## 📄️ What is Automatic Differentiation (Autograd)?

Automatic differentiation, often referred to as autograd, is a technique used to evaluate the derivatives of functions specified by computer programs. Unlike numerical differentiation, which approximates derivatives using finite differences, or symbolic differentiation, which manipulates mathematical expressions directly, automatic differentiation computes exact derivatives efficiently through a process of program transformation.
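To make the idea concrete, here is a minimal sketch of reverse-mode automatic differentiation in Python (NumPower Autograd itself is a PHP library; this toy `Value` class is not its API, just an illustration of the technique). Each operation records its inputs and local derivatives, and `backward` propagates gradients through the recorded graph via the chain rule:

```python
class Value:
    """Toy reverse-mode autograd node: tracks data, parents, and local derivatives."""

    def __init__(self, data, parents=(), local_grads=()):
        self.data = data
        self.parents = parents          # input nodes this value was computed from
        self.local_grads = local_grads  # d(self)/d(parent) for each parent
        self.grad = 0.0                 # accumulated dL/d(self)

    def __add__(self, other):
        other = other if isinstance(other, Value) else Value(other)
        # d(a+b)/da = 1, d(a+b)/db = 1
        return Value(self.data + other.data, (self, other), (1.0, 1.0))

    def __mul__(self, other):
        other = other if isinstance(other, Value) else Value(other)
        # d(a*b)/da = b, d(a*b)/db = a
        return Value(self.data * other.data, (self, other), (other.data, self.data))

    __radd__ = __add__
    __rmul__ = __mul__

    def backward(self, upstream=1.0):
        # Chain rule: accumulate upstream gradient, then push it to each parent.
        self.grad += upstream
        for parent, local in zip(self.parents, self.local_grads):
            parent.backward(upstream * local)


x = Value(2.0)
y = x * x + 3 * x   # f(x) = x^2 + 3x, so f'(x) = 2x + 3
y.backward()
print(y.data)  # 10.0
print(x.grad)  # 7.0  (exact derivative at x = 2, no finite-difference error)
```

Unlike a finite-difference approximation, the derivative computed this way is exact (up to floating-point rounding), which is the key property the paragraph above describes.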

## 📄️ Installing Autograd

NumPower Autograd is available through Composer and builds on the NumPower extension.

## 📄️ Basic usage

On this page, we will look at how to use the Autograd library in a simplified way.

## 📄️ Simple Neural Net from scratch using Autograd

In this section, we'll introduce the concept of automatic differentiation (autograd) by implementing a simple neural network from scratch.