
Custom Endpoints

RumiTalk supports OpenAI API compatible services using the rumitalk.yaml configuration file.

This guide assumes you have already set up RumiTalk using Docker, as shown in the Local Setup Guide.

ℹ️
Configuration Files Clarification

RumiTalk uses several configuration files, each with specific purposes:

  1. rumitalk.yaml - Used for custom endpoints configuration and other application settings
  2. .env file - Used for server configuration, pre-configured endpoint API keys, and authentication settings
  3. docker-compose.override.yml - Used for Docker-specific configurations and mounting volumes

Step 1. Create or Edit a Docker Override File

  • Create a file named docker-compose.override.yml at the project root (if it doesn’t already exist).
  • Add the following content to the file:
services:
  api:
    volumes:
    - type: bind
      source: ./rumitalk.yaml
      target: /app/rumitalk.yaml

Learn more about the Docker Compose Override File here.

Step 2. Configure rumitalk.yaml

  • Create a file named rumitalk.yaml at the project root (if it doesn’t already exist).

  • Add your custom endpoints: you can view compatible endpoints in the AI Endpoints section.

    • The list is not exhaustive; in general, any OpenAI API-compatible service should work.
    • There are many options for Custom Endpoints. View them all here: Custom Endpoint Object Structure.
  • As an example, here is a configuration for both OpenRouter and Ollama:

    version: 1.1.4
    cache: true
    endpoints:
      custom:
        - name: "OpenRouter"
          apiKey: "${OPENROUTER_KEY}"
          baseURL: "https://openrouter.ai/api/v1"
          models:
            default: ["gpt-3.5-turbo"]
            fetch: true
          titleConvo: true
          titleModel: "current_model"
          summarize: false
          summaryModel: "current_model"
          forcePrompt: false
          modelDisplayLabel: "OpenRouter"
        - name: "Ollama"
          apiKey: "ollama"
          baseURL: "http://host.docker.internal:11434/v1/"
          models:
            default: [
              "llama3:latest",
              "command-r",
              "mixtral",
              "phi3"
            ]
            fetch: false # fetching the list of models is not supported
          titleConvo: true
          titleModel: "current_model"
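
Before restarting the app, it can help to sanity-check a configuration like the one above. The following is an illustrative sketch, not part of RumiTalk: it assumes (based only on this guide's examples) that each custom endpoint needs at least `name`, `apiKey`, and `baseURL`, and checks a plain dict as a stand-in for the parsed YAML.

```python
# Sketch: flag custom endpoint entries that are missing required fields.
# The required-field list is an assumption drawn from the example above,
# not an official RumiTalk schema.

REQUIRED_FIELDS = ("name", "apiKey", "baseURL")

def validate_endpoints(config: dict) -> list[str]:
    """Return a list of problems found in config['endpoints']['custom']."""
    problems = []
    custom = config.get("endpoints", {}).get("custom", [])
    if not custom:
        problems.append("no custom endpoints defined")
    for i, endpoint in enumerate(custom):
        for field in REQUIRED_FIELDS:
            if not endpoint.get(field):
                problems.append(f"endpoint #{i}: missing '{field}'")
    return problems

config = {
    "version": "1.1.4",
    "endpoints": {
        "custom": [
            {"name": "OpenRouter", "apiKey": "${OPENROUTER_KEY}",
             "baseURL": "https://openrouter.ai/api/v1"},
            {"name": "Ollama", "apiKey": "ollama"},  # baseURL omitted on purpose
        ]
    },
}

print(validate_endpoints(config))  # flags the missing baseURL
```

To check your real file, you would load it with a YAML parser first and pass the resulting dict to the same function.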
⚠️
Important: API Key Configuration

When configuring API keys in custom endpoints, you have three options:

  1. Environment Variable Reference: Use ${VARIABLE_NAME} syntax to reference a variable from your .env file (recommended for security)

    apiKey: "${OPENROUTER_KEY}"
  2. User Provided: Set to "user_provided" (without the $ syntax) to allow users to enter their own API key through the web interface

    apiKey: "user_provided"
  3. Direct Value: Directly include the API key in the configuration file (not recommended for security reasons)

    apiKey: "your-actual-api-key"

This is different from pre-configured endpoints in the .env file where you would set ENDPOINT_KEY=user_provided (e.g., OPENAI_API_KEY=user_provided).

Step 3. Configure .env File

  • Edit your existing .env file at the project root
    • If it doesn’t already exist, copy .env.example and rename it to .env.
  • According to the config above, the environment variable OPENROUTER_KEY is expected and should be set:
OPENROUTER_KEY=your_openrouter_api_key
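
For reference, a .env file is just plain KEY=value lines. A minimal parser (a sketch; real loaders handle quoting, export prefixes, and other edge cases) shows what the app effectively reads:

```python
def parse_env(text: str) -> dict[str, str]:
    """Parse simple KEY=value lines, skipping blanks and '#' comments."""
    env = {}
    for line in text.splitlines():
        line = line.strip()
        if not line or line.startswith("#"):
            continue
        key, sep, value = line.partition("=")
        if sep:  # ignore malformed lines without '='
            env[key.strip()] = value.strip()
    return env

sample = """
# API keys
OPENROUTER_KEY=your_openrouter_api_key
OPENAI_API_KEY=user_provided
"""
print(parse_env(sample)["OPENROUTER_KEY"])  # your_openrouter_api_key
```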

Notes:

  • By way of example, this guide assumes you have set up Ollama independently and that it is accessible at http://host.docker.internal:11434
    • "host.docker.internal" is a special DNS name that resolves to the internal IP address used by the host.
    • You may need to change this to the actual IP address of your Ollama instance.
  • In a future guide, we will go into setting up Ollama along with RumiTalk.

Step 4. Run the App

  • Now that your files are configured, you can run the app:
docker compose up

Or, if you were running the app before, you can restart the app with:

docker compose restart

Note: Make sure your Docker Desktop or Docker Engine is running before executing the command.

Conclusion

That’s it! You have now configured Custom Endpoints for your RumiTalk instance.

Additional Links

Explore more about RumiTalk and how to configure it to your needs.