Generation
Temperature
Quick definition
Temperature controls randomness in text generation.
- Category: Generation
- Focus: output style and randomness
- Used in: tasks that need precise, repeatable answers (low values) or creative, exploratory output (high values)
What it means
Higher values produce more diverse outputs, while lower values make the model more deterministic. In generation workflows, temperature is the main setting for shaping output style and randomness.
How it works
Temperature scales the model's token probabilities before sampling: the logits are divided by the temperature, so values below 1 sharpen the distribution toward the most likely tokens, while values above 1 flatten it and give less likely tokens a better chance of being picked.
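As a rough sketch in plain Python/NumPy (the function and the example logits are illustrative, not tied to any particular model or to BoltAI), temperature-scaled sampling looks like this:

```python
import numpy as np

def sample_with_temperature(logits: np.ndarray, temperature: float) -> int:
    """Sample a token index from logits scaled by temperature."""
    # Divide logits by temperature: values < 1 sharpen the distribution,
    # values > 1 flatten it.
    scaled = logits / max(temperature, 1e-8)
    # Softmax (shifted by the max for numerical stability) gives probabilities.
    probs = np.exp(scaled - scaled.max())
    probs /= probs.sum()
    return int(np.random.choice(len(probs), p=probs))

# Hypothetical logits for four candidate tokens.
logits = np.array([2.0, 1.0, 0.5, 0.1])
low = [sample_with_temperature(logits, 0.2) for _ in range(10)]   # almost always token 0
high = [sample_with_temperature(logits, 1.5) for _ in range(10)]  # noticeably more varied
```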
Why it matters
The right temperature trades creativity against determinism: a value that works for brainstorming adds noise to factual or code-oriented tasks, while a value tuned for precision makes creative output flat and repetitive.
Common use cases
- Lower randomness for precise, repeatable answers.
- Higher randomness for brainstorming and creative tasks.
- Moderate values for everyday chat that should be varied but stay on topic.
Example
Use a temperature around 0.2 for precise, factual answers and around 0.8 for creative writing or brainstorming.
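For instance, with an OpenAI-compatible API (shown here via the OpenAI Python SDK; the model name and prompts are illustrative, and this is not BoltAI's internal code), the same setting is passed as a request parameter:

```python
from openai import OpenAI

client = OpenAI()  # assumes OPENAI_API_KEY is set in the environment

# Low temperature: precise, repeatable answer.
factual = client.chat.completions.create(
    model="gpt-4o-mini",  # illustrative model name
    messages=[{"role": "user", "content": "Summarize RFC 2119 in one sentence."}],
    temperature=0.2,
)

# High temperature: more varied, creative output.
creative = client.chat.completions.create(
    model="gpt-4o-mini",
    messages=[{"role": "user", "content": "Brainstorm five names for a note-taking app."}],
    temperature=0.8,
)

print(factual.choices[0].message.content)
print(creative.choices[0].message.content)
```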
Pitfalls and tips
High temperature can reduce factual accuracy, while very low temperature can make responses repetitive. Tune per task and evaluate the results.
In BoltAI
In BoltAI, temperature appears among the model settings that shape responses; adjust it there when a task calls for more deterministic or more creative output.