This guide demonstrates how to use LiteLLM as a backend for Curator to generate synthetic data using various LLM providers. We'll walk through an example of generating synthetic recipes, but the approach can be adapted to any synthetic data generation task.
Prerequisites
Python 3.10+
Curator (pip install bespokelabs-curator)
Access to an LLM provider (e.g., Gemini API key)
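If you're using Gemini through LiteLLM, for example, the API key is typically picked up from an environment variable. The snippet below is a minimal sketch; GEMINI_API_KEY is the variable LiteLLM conventionally reads for Gemini models, but check your provider's documentation for the exact name.
import os

# LiteLLM reads provider credentials from environment variables.
# Prefer exporting the key in your shell; setting it in code is for illustration only.
os.environ["GEMINI_API_KEY"] = "your-api-key"  # placeholder value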
Steps
1. Create a curator.LLM Subclass
First, create a class that inherits from curator.LLM. You'll need to implement two key methods:
prompt(): Generates the prompt for the LLM
parse(): Processes the LLM's response into your desired format
"""Generate synthetic recipes for different cuisines using curator."""
from datasets import Dataset
from bespokelabs import curator
class RecipeGenerator(curator.LLM):
"""A recipe generator that generates recipes for different cuisines."""
def prompt(self, input: dict) -> str:
"""Generate a prompt using the template and cuisine."""
return f"Generate a random {input['cuisine']} recipe. Be creative but keep it realistic."
def parse(self, input: dict, response: str) -> dict:
"""Parse the model response along with the input to the model into the desired output format.."""
return {
"recipe": response,
"cuisine": input["cuisine"],
}
2. Set Up Your Seed Dataset
Create a dataset of inputs using the HuggingFace Dataset class:
# List of cuisines to generate recipes for
cuisines = [
    {"cuisine": cuisine}
    for cuisine in [
        "Chinese",
        "Italian",
        "Mexican",
        "French",
        "Japanese",
        "Indian",
        "Thai",
        "Korean",
        "Vietnamese",
        "Brazilian",
    ]
]
cuisines = Dataset.from_list(cuisines)
3. Configure LiteLLM Backend
Initialise your generator with LiteLLM configuration.
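The exact constructor arguments may vary across Curator versions; the sketch below assumes curator.LLM accepts model_name and backend keyword arguments, where backend="litellm" routes requests through LiteLLM and the model name follows LiteLLM's provider/model convention. The Gemini model string is only an example.
# A minimal sketch, assuming curator.LLM accepts model_name and backend
# keyword arguments; adjust the model string to your provider.
recipe_generator = RecipeGenerator(
    model_name="gemini/gemini-1.5-flash",  # any LiteLLM-supported model string
    backend="litellm",                     # route requests through LiteLLM
)

# Run the generator over the seed dataset; each cuisine yields one recipe.
recipes = recipe_generator(cuisines)
# Depending on the Curator version, the result is (or wraps) a HuggingFace Dataset.
print(recipes.to_pandas())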