Quick definition
PEFT stands for parameter-efficient fine-tuning.
- Category: Training
- Focus: model adaptation
- Used in: Adapting a base model to your domain or style.
What it means
It reduces compute and memory by freezing the base model's weights and training only a small subset of parameters, often newly added modules. In training workflows, PEFT makes model adaptation practical without the cost of updating every weight.
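As a back-of-the-envelope illustration of why this matters, the savings can be counted directly. The model sizes below are hypothetical, and only the attention projection matrices are counted for brevity:

```python
hidden = 4096          # hidden size of an assumed transformer
layers = 32            # number of layers
rank = 8               # LoRA rank (illustrative)

# Rough full fine-tune count: 4 attention projections per layer,
# each a hidden x hidden matrix (MLP and embeddings ignored for brevity).
full_params = layers * 4 * hidden * hidden

# LoRA adds two low-rank matrices (hidden x rank and rank x hidden)
# per adapted projection; only these are trained.
lora_params = layers * 4 * 2 * hidden * rank

fraction = lora_params / full_params
print(f"full: {full_params:,}  lora: {lora_params:,}  fraction: {fraction:.2%}")
# prints: full: 2,147,483,648  lora: 8,388,608  fraction: 0.39%
```

Under these assumptions the trainable set is well under 1% of the adapted weights, which is why PEFT runs fit on far smaller hardware.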
How it works
PEFT freezes the base model's weights and inserts or selects a small set of trainable parameters, such as low-rank matrices (LoRA) or adapter layers. Fine-tuning then updates only those parameters against a curated dataset, with evaluation loops to track quality, so gradients, optimizer state, and checkpoints stay small.
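The freeze-and-train mechanics can be sketched in a few lines. This is an illustrative NumPy toy, not a real framework: a frozen weight `W` is augmented with low-rank factors `A` and `B`, and plain gradient descent on a toy least-squares task updates only the factors.

```python
import numpy as np

rng = np.random.default_rng(0)
d, r = 16, 2                         # feature dim and LoRA rank (toy sizes)
W = rng.normal(size=(d, d))          # frozen "pretrained" weight, never updated
A = rng.normal(size=(r, d)) * 0.1    # trainable low-rank factor
B = np.zeros((d, r))                 # zero-init so B @ A starts as a no-op

X = rng.normal(size=(64, d))
delta = rng.normal(size=(d, d)) * 0.1
Y = X @ (W + delta).T                # toy targets from a slightly shifted weight

def loss():
    return float(np.mean((X @ (W + B @ A).T - Y) ** 2))

before = loss()
lr = 0.1
for _ in range(300):
    err = X @ (W + B @ A).T - Y
    grad_W = err.T @ X / len(X)      # gradient w.r.t. the effective weight
    grad_A = B.T @ grad_W            # chain rule into the low-rank factors;
    grad_B = grad_W @ A.T            # W itself receives no update
    A -= lr * grad_A
    B -= lr * grad_B
after = loss()
print(f"loss {before:.4f} -> {after:.4f}")
```

The zero-initialized `B` means training starts exactly at the base model's behavior, a common LoRA convention; only `A` and `B` ever change.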
Why it matters
PEFT tailors models to your domain and use case on modest hardware: because only a fraction of the parameters are trained, fine-tuning can fit on a single GPU and the resulting adapter weights are small enough to store and swap cheaply.
Common use cases
- Adapting a base model to your domain or style.
- Improving instruction following for specific tasks.
- Reducing errors with better training data.
Example
Instead of full fine-tuning, attach LoRA low-rank updates or small adapter layers to a frozen base model and train only those.
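The "adapters" variant mentioned above can be sketched as a small bottleneck module with a residual connection. Sizes and names here are illustrative, and the frozen base layer the adapter would sit inside is omitted:

```python
import numpy as np

rng = np.random.default_rng(1)
hidden, bottleneck = 64, 8          # illustrative sizes; bottleneck << hidden

# Trainable adapter weights (everything around them stays frozen).
W_down = rng.normal(size=(hidden, bottleneck)) * 0.05  # down-projection
W_up = np.zeros((bottleneck, hidden))                  # zero-init up-projection

def adapter(x):
    # Bottleneck adapter: down-project, nonlinearity, up-project, add residual.
    h = np.maximum(x @ W_down, 0.0)   # ReLU inside the low-dim bottleneck
    return x + h @ W_up               # residual keeps base behavior at init

x = rng.normal(size=(4, hidden))
out = adapter(x)
print(out.shape)  # with W_up zero-initialized, adapter(x) == x
```

Zero-initializing the up-projection makes the adapter an identity function before training, so inserting it cannot degrade the base model on day one.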
Pitfalls and tips
Low-quality data can degrade performance. Keep datasets clean, representative, and well-labeled.
In BoltAI
In BoltAI, this is referenced when discussing model customization.