If you are unfamiliar with the definition of the gradient, you can learn about it in the following Khan Academy video.
In essence, the gradient of a function is a vector whose entries are the partial derivatives of that function with respect to each input variable.
Given a function $f(x, y)$ with input variables $x$ and $y$, the gradient of $f$ is written $\nabla f$ ($\nabla$ is pronounced ‘del’).
$$ \nabla f = \begin{bmatrix} \frac{\partial f}{\partial x} & \frac{\partial f}{\partial y} \end{bmatrix} $$
It may look like just an array of numbers, but it is far more interesting than that!
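To make the definition concrete, here is a short Python sketch (the function name `numerical_gradient` is my own) that approximates each partial derivative with a central difference, then checks it against a function whose gradient we can compute by hand:

```python
import numpy as np

def numerical_gradient(f, point, h=1e-5):
    """Approximate the gradient of f at `point` using central differences."""
    point = np.asarray(point, dtype=float)
    grad = np.zeros_like(point)
    for i in range(point.size):
        step = np.zeros_like(point)
        step[i] = h
        # Partial derivative with respect to the i-th input variable
        grad[i] = (f(point + step) - f(point - step)) / (2 * h)
    return grad

# Example: f(x, y) = x^2 + 3y, so the true gradient is [2x, 3]
f = lambda p: p[0] ** 2 + 3 * p[1]
print(numerical_gradient(f, [2.0, 1.0]))  # close to [4.0, 3.0]
```

At the point $(2, 1)$ the analytic gradient is $[2 \cdot 2,\ 3] = [4, 3]$, which the numerical approximation recovers to high accuracy.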