human-feedback - Concepts

Explore concepts tagged with "human-feedback". Total concepts: 1

Concepts

Reinforcement Learning from Human Feedback (RLHF) - A training technique that aligns LLM outputs with human preferences by using human feedback to guide model behavior.
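The RLHF definition above can be illustrated with a minimal sketch of one of its core steps: fitting a reward model to pairwise human preferences via a Bradley-Terry-style loss. This is a simplified illustration, not a full RLHF pipeline; the function name and numeric values are invented for the example.

```python
import math

def preference_loss(reward_chosen: float, reward_rejected: float) -> float:
    # Bradley-Terry style objective used when training a reward model
    # from human preference labels: the modeled probability that the
    # human-preferred response beats the rejected one is
    # sigmoid(r_chosen - r_rejected); we minimize its negative log.
    margin = reward_chosen - reward_rejected
    prob_chosen = 1.0 / (1.0 + math.exp(-margin))
    return -math.log(prob_chosen)

# A reward model that scores the human-preferred response higher
# incurs a lower loss than one that scores it lower (values illustrative).
loss_agree = preference_loss(2.0, -1.0)     # model agrees with the human label
loss_disagree = preference_loss(-1.0, 2.0)  # model disagrees
```

The trained reward model then supplies the scalar feedback signal that guides the policy (the LLM) during the reinforcement-learning stage.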