
API Service


Last updated 8 months ago

Using the API service is straightforward.

Step 1: API Key Setup

First, get your API key at the Bespoke Console.

export BESPOKE_API_KEY=bespoke-...
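If you prefer to fail fast before constructing the client, you can check for the key in Python first. This helper is a small sketch of my own, not part of the `bespokelabs` package; the error message is illustrative:

```python
import os

def get_api_key() -> str:
    """Fetch the Bespoke API key from the environment, failing fast if unset."""
    key = os.environ.get("BESPOKE_API_KEY")
    if not key:
        raise RuntimeError("BESPOKE_API_KEY is not set; export it before running.")
    return key
```

Calling `get_api_key()` with the variable unset raises immediately, which gives a clearer error than a failed API request later.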

Step 2: Install Dependencies

Install the package:

pip install bespokelabs

Step 3: Run

import os
from bespokelabs import BespokeLabs

bl = BespokeLabs(
    # This is the default and can be omitted
    auth_token=os.environ.get("BESPOKE_API_KEY"),
)

response = bl.minicheck.factcheck.create(
    claim="The Eiffel Tower is located in Paris.",
    context="The Eiffel Tower is a wrought-iron lattice tower in Paris, France.",
)
# support_prob is the probability that the context supports the claim
print(response.support_prob)
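Downstream, you will usually want to turn `support_prob` into a decision. The helper and the 0.5 cutoff below are my own illustration, not part of the SDK:

```python
def verdict(support_prob: float, threshold: float = 0.5) -> str:
    """Map a MiniCheck support probability to a coarse label.

    The 0.5 cutoff is illustrative only; pick a threshold that matches
    your application's tolerance for false positives.
    """
    return "supported" if support_prob >= threshold else "unsupported"

print(verdict(0.92))  # supported
print(verdict(0.12))  # unsupported
```

Raising the threshold trades recall for precision, so tune it against a labeled sample of your own claims rather than reusing 0.5 blindly.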

More information about the library is available on the bespokelabs PyPI page.