Reward Shaping
A reinforcement learning technique in which the reward signal given to an agent is deliberately designed to encourage desired behaviors and discourage undesired ones.
Reward shaping guides the agent toward the desired outcomes more efficiently, especially when the environment's natural rewards are sparse or the task is complex or uncertain.
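One common, theoretically grounded form is potential-based reward shaping, where a bonus F(s, s') = γ·Φ(s') − Φ(s) is added to the environment's reward; this form provably preserves the optimal policy (Ng, Harada & Russell, 1999). The sketch below illustrates the idea on a hypothetical 1-D gridworld; the goal position, discount factor, and potential function are illustrative assumptions, not part of any specific library.

```python
# Minimal sketch of potential-based reward shaping on an assumed
# 1-D gridworld where the agent must reach cell GOAL.

GOAL = 10    # goal cell (illustrative assumption)
GAMMA = 0.99 # discount factor (illustrative assumption)

def sparse_reward(next_state):
    """Base environment reward: +1 only at the goal (sparse)."""
    return 1.0 if next_state == GOAL else 0.0

def potential(state):
    """Potential function Phi: higher (less negative) when closer to the goal."""
    return -abs(GOAL - state)

def shaped_reward(state, next_state):
    """Sparse reward plus the shaping term F(s, s') = gamma*Phi(s') - Phi(s)."""
    shaping = GAMMA * potential(next_state) - potential(state)
    return sparse_reward(next_state) + shaping

# Moving toward the goal earns a larger shaped reward than moving away,
# giving the agent a learning signal long before it first reaches the goal.
assert shaped_reward(3, 4) > shaped_reward(3, 2)
```

Because the shaping term is a difference of potentials, it densifies feedback without changing which policy is optimal; ad-hoc shaping bonuses that are not potential-based can instead cause the agent to exploit the bonus itself.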