# Using Ollama with Curator

You can use **Ollama** as a backend for **Curator** to generate structured synthetic data. In this example, we will generate a list of countries and their capitals, but the approach can be adapted for any data generation task.

## **Prerequisites**

* **Python 3.10+**
* **Curator**: Install via `pip install bespokelabs-curator`
* **Ollama**: Download via <https://ollama.com/download>

## **Steps**

### **1. Create a curator.LLM Subclass**

Create a class that inherits from `curator.LLM`. Implement two key methods:

* `prompt()`: Generates the prompt for the LLM.
* `parse()`: Processes the LLM's response into your desired format.

Here’s the implementation:

```python
from bespokelabs import curator
from pydantic import BaseModel, Field

class Location(BaseModel):
    country: str = Field(description="The name of the country")
    capital: str = Field(description="The name of the capital city")

class LocationList(BaseModel):
    locations: list[Location] = Field(description="A list of locations")

class SimpleOllamaGenerator(curator.LLM):
    response_format = LocationList

    def prompt(self, input: dict) -> str:
        return "Return five countries and their capitals."

    def parse(self, input: dict, response: LocationList) -> list[dict]:
        return [{"country": output.country, "capital": output.capital} for output in response.locations]
```
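The `parse()` method receives the response already validated against `response_format`, so it only has to flatten the Pydantic objects into plain dicts. A minimal offline sketch of that transformation (the sample payload here is illustrative, standing in for what the model would return):

```python
from pydantic import BaseModel, Field

class Location(BaseModel):
    country: str = Field(description="The name of the country")
    capital: str = Field(description="The name of the capital city")

class LocationList(BaseModel):
    locations: list[Location] = Field(description="A list of locations")

# Simulate a structured response from the model
raw = {"locations": [{"country": "France", "capital": "Paris"}]}
response = LocationList.model_validate(raw)

# The same flattening that parse() performs
rows = [{"country": loc.country, "capital": loc.capital} for loc in response.locations]
print(rows)
```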

### **2. Configure the Ollama Backend**

1. Pull the `llama3.1:8b` model and start the Ollama server:

```bash
ollama pull llama3.1:8b
ollama serve
```

2. Initialize your generator with Ollama configuration:

```python
llm = SimpleOllamaGenerator(
    model_name="ollama/llama3.1:8b",  # Ollama model identifier
    backend_params={"base_url": "http://localhost:11434"},  # Ollama instance
)
```
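Before running the generator, you can confirm the server is reachable and the model is pulled. Ollama exposes an `/api/tags` endpoint that lists local models; a standard-library sketch (the port is Ollama's default, and `list_local_models` is a helper defined here, not part of Curator):

```python
import json
import urllib.request

def model_names(tags: dict) -> list[str]:
    """Extract model names from an /api/tags response payload."""
    return [m["name"] for m in tags.get("models", [])]

def list_local_models(base_url: str = "http://localhost:11434") -> list[str]:
    """Query a running Ollama instance for its pulled models."""
    with urllib.request.urlopen(f"{base_url}/api/tags") as resp:
        return model_names(json.load(resp))

# Offline check of the parsing helper with a sample payload
sample = {"models": [{"name": "llama3.1:8b"}]}
print(model_names(sample))

# Against a running server:
# print("llama3.1:8b" in list_local_models())
```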

### **3. Generate Data**

Generate the structured data and output the results as a pandas DataFrame:

```python
locations = llm()
print(locations.dataset.to_pandas())
```
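From the DataFrame you can persist the generated rows for downstream use. A short sketch with pandas, using sample rows in the shape `parse()` returns (the values and filename are illustrative):

```python
import pandas as pd

# Rows in the shape parse() returns
rows = [
    {"country": "France", "capital": "Paris"},
    {"country": "Japan", "capital": "Tokyo"},
]

df = pd.DataFrame(rows)
df.to_csv("capitals.csv", index=False)  # persist results for later use
print(len(df))
```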

### **Example Output**

Using the above example, the output might look like this:

| Country | Capital   |
| ------- | --------- |
| France  | Paris     |
| Japan   | Tokyo     |
| Germany | Berlin    |
| India   | New Delhi |
| Brazil  | Brasília  |

## **Ollama Configuration**

Use `base_url` in `backend_params` to specify the URL of your Ollama instance.

Example:

```python
backend_params={"base_url": "http://localhost:11434"}
```
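If Ollama runs on another machine, point `base_url` at that host instead of localhost (the hostname below is a placeholder, not a real address):

```python
# Ollama instance on a remote host; 11434 is Ollama's default port
backend_params={"base_url": "http://ollama-host.example.com:11434"}
```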

