Quick Start

This guide shows how to integrate your application with the Pragmatum proxy.

Create a Provider

Go to your Pragmatum console's credentials page and create an OpenAI provider by entering your OpenAI API key. Don't worry, we store your key securely.

Create a Pragmatum API key

Then, create a Pragmatum API key using the provider you just created, and copy it. In the next step, you will use this Pragmatum API key in place of your OpenAI key to call the proxy.

When you're done, your screen should show one provider and one API key.

Integration

Using the API key from the previous step, you can integrate with standard LLM provider clients. Below are integration examples for different languages.

Integration (Python)

from openai import OpenAI
client = OpenAI(
    api_key="pragmatum API key from step 2",  # or set your OPENAI_API_KEY env var to the Pragmatum API key instead
    base_url="https://proxy.pragmatum.com/api/providers/openai/v1",
)

Integration (Javascript/Typescript)

import OpenAI from 'openai';
 
const openai = new OpenAI({
  apiKey: "pragmatum API key from step 2", // or set your OPENAI_API_KEY env var to the Pragmatum API key instead
  baseURL: "https://proxy.pragmatum.com/api/providers/openai/v1", // redirect to pragmatum proxy
});

You're done! Observe your LLM usage

Redeploy your application and go to the interactions page to see your queries. You get logging, caching, failover, and security signals out of the box.

For more details on the proxy, please refer to our Proxy Guide.