πŸ–₯️ Running Models Locally

There are various ways to run models locally and create OpenAI-compatible endpoints, for example with open-source libraries such as llama.cpp, Ollama, vLLM, or LocalAI.
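
As a rough illustration, a locally hosted, OpenAI-compatible server can be called with the standard OpenAI Python client simply by pointing `base_url` at the local endpoint. The sketch below assumes Ollama is running on its default port; the port, path, and model name are placeholders and depend on the server you actually use.

```python
# Minimal sketch, assuming a local server (e.g. Ollama or llama.cpp's
# llama-server) is already running and exposing an OpenAI-compatible API.
# The base_url and model name below are placeholders; adjust them to
# whatever your local server actually serves.
from openai import OpenAI

client = OpenAI(
    base_url="http://localhost:11434/v1",  # e.g. Ollama's default endpoint
    api_key="not-needed-locally",          # most local servers ignore the key
)

response = client.chat.completions.create(
    model="llama3",  # placeholder model name
    messages=[{"role": "user", "content": "Hello from a local model!"}],
)
print(response.choices[0].message.content)
```

Because the request never leaves your machine, most local servers ignore the API key entirely; any non-empty string will usually do.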

If you are planning to run the models on consumer-grade personal computers, you may want to use models in GGUF format. Read this discussion on conversion.
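
As a hedged sketch of using a GGUF file directly, the llama-cpp-python bindings can load a quantized model on a typical laptop or desktop. The model path below is hypothetical and should point to a GGUF file you have downloaded or converted yourself.

```python
# Minimal sketch, assuming llama-cpp-python is installed
# (pip install llama-cpp-python) and a GGUF file exists at the path below.
from llama_cpp import Llama

llm = Llama(
    model_path="./models/mistral-7b-instruct.Q4_K_M.gguf",  # hypothetical path
    n_ctx=4096,  # context window; adjust to your hardware
)

out = llm.create_chat_completion(
    messages=[{"role": "user", "content": "Explain GGUF in one sentence."}]
)
print(out["choices"][0]["message"]["content"])
```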

Remotely Hosted Models

You can also enter the OpenAI-compatible base URL and API key of any remotely hosted service, although this is not recommended for sensitive data. For example (a client sketch follows the list below):

  • OpenAI: https://api.openai.com/v1/

  • Gemini: https://my-openai-gemini-henna.vercel.app/v1
    (Note: cannot fetch the model list; defaults to Gemini-1.5-pro)

  • Groq: https://api.groq.com/openai/v1
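
To illustrate, any of these endpoints can be used with the standard OpenAI Python client by swapping in the base URL and key. The sketch below uses the Groq endpoint listed above, assumes the API key is exported as a GROQ_API_KEY environment variable, and uses a placeholder model name that must match one the provider actually serves.

```python
# Minimal sketch, assuming a Groq API key is exported as GROQ_API_KEY.
# The model name is a placeholder; check the provider's model list.
import os
from openai import OpenAI

client = OpenAI(
    base_url="https://api.groq.com/openai/v1",
    api_key=os.environ["GROQ_API_KEY"],
)

response = client.chat.completions.create(
    model="llama-3.1-8b-instant",  # placeholder model name
    messages=[{"role": "user", "content": "Summarize GGUF in one sentence."}],
)
print(response.choices[0].message.content)
```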
