Gradient Descent
An optimization algorithm used to train machine learning models, including the models behind AI agents, by minimizing a loss function that measures the error between predicted and actual outcomes.
Gradient descent iteratively adjusts the model’s parameters in the direction opposite the gradient of the loss, shrinking the error step by step and improving the agent's accuracy in tasks such as classification or prediction.
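At each step, the parameters are updated as θ ← θ − η∇L(θ), where η is the learning rate and ∇L(θ) is the gradient of the loss. The following is a minimal sketch in plain Python, not any particular library's implementation; the toy data, the learning rate of 0.01, and the 200-step count are illustrative choices, and the example fits a single weight by least squares.

```python
# Minimal sketch: gradient descent on a one-parameter least-squares problem.
# The data, learning rate, and step count are illustrative, not from the source.

# Toy data: y is roughly 3 * x, so the best weight is near 3.
xs = [1.0, 2.0, 3.0, 4.0]
ys = [3.1, 5.9, 9.2, 11.8]

w = 0.0    # initial parameter guess
lr = 0.01  # learning rate (step size)

for step in range(200):
    # Gradient of the mean squared error L(w) = mean((w*x - y)^2)
    # with respect to w: dL/dw = mean(2 * (w*x - y) * x).
    grad = sum(2 * (w * x - y) * x for x, y in zip(xs, ys)) / len(xs)
    # Update rule: step against the gradient, w <- w - lr * dL/dw,
    # which lowers the loss a little on each iteration.
    w -= lr * grad

print(f"learned weight: {w:.3f}")  # converges toward ~3
```

If the learning rate is too large the updates can overshoot and diverge; if it is too small, convergence is slow. In practice, deep learning frameworks compute the gradients automatically and apply variants of this same update rule.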