Self-Hosting
You can access the model here: https://huggingface.co/bespokelabs/Bespoke-Minicheck-7B
Feel free to use this Colab notebook, which uses the MiniCheck library and supports automated chunking of long documents.
Or, you can host the model directly on vLLM with Docker, as sketched below.
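A minimal sketch using vLLM's official OpenAI-compatible server image. The image name, port, and cache mount follow vLLM's public Docker documentation; adjust the GPU runtime flags, port, and cache path to your environment:

```bash
# Serve Bespoke-MiniCheck-7B behind vLLM's OpenAI-compatible server (default port 8000)
docker run --runtime nvidia --gpus all \
    -v ~/.cache/huggingface:/root/.cache/huggingface \
    -p 8000:8000 \
    --ipc=host \
    vllm/vllm-openai:latest \
    --model bespokelabs/Bespoke-Minicheck-7B
```

Once the container is running, it exposes an OpenAI-compatible API. A hypothetical request is shown below; the placeholder document and claim text are illustrative only, and the exact prompt format expected by the model is described on the model card:

```bash
# Send a document/claim pair to the chat completions endpoint
curl http://localhost:8000/v1/chat/completions \
    -H "Content-Type: application/json" \
    -d '{
          "model": "bespokelabs/Bespoke-Minicheck-7B",
          "messages": [{"role": "user", "content": "Document: <document text>\nClaim: <claim text>"}],
          "temperature": 0,
          "max_tokens": 1
        }'
```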
Please contact us for commercial licensing.