Gradient descent is an iterative method for minimising a multivariable [[Function|function]] $G(\vec{x})$. Starting from an initial guess $\vec{x}_0$, the gradient $\nabla G(\vec{x}_0)$ is used to take a step in the direction of steepest descent, producing a better guess; the process repeats until $\nabla G(\vec{x}_n) \approx \vec{0}$, at which point $\vec{x}_n$ is a local minimum of $G$. While the method is an efficient way of finding a minimum of a function, it may become stuck at a local minimum and fail to reach the absolute minimum. Here, gradient descent will be used to let economic actors optimise their utility as a multivariable function of the quantities of the various goods they consume, and thereby decide which goods to purchase from the market, creating demand.
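The iteration described above can be sketched in a few lines of Python with NumPy. This is a minimal illustration, not a production optimiser: the function names, test function $G(x, y) = (x-1)^2 + (y+2)^2$, step size, and stopping tolerance are all illustrative choices, not taken from the text.

```python
import numpy as np

def gradient_descent(grad, x0, step=0.1, tol=1e-8, max_iter=10_000):
    """Minimise a function given its gradient, starting from x0."""
    x = np.asarray(x0, dtype=float)
    for _ in range(max_iter):
        g = grad(x)
        if np.linalg.norm(g) < tol:   # stop once the gradient is roughly zero
            break
        x = x - step * g              # step along the direction of steepest descent
    return x

# Illustrative example: G(x, y) = (x - 1)^2 + (y + 2)^2 has its minimum at (1, -2).
grad_G = lambda x: 2 * (x - np.array([1.0, -2.0]))
minimum = gradient_descent(grad_G, x0=[0.0, 0.0])
```

Note that the step size (often called the learning rate) must be chosen with care: too large and the iterates can overshoot and diverge, too small and convergence becomes very slow.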