Thursday, March 6, 2025

Portkey: An open-source AI gateway for easy LLM orchestration




from portkey_ai import Portkey
import os

# Route ~10% of requests to OpenAI and ~90% to Groq.
lb_config = {
    "strategy": {"mode": "loadbalance"},
    "targets": [
        {
            "provider": "openai",
            "api_key": os.environ["OPENAI_API_KEY"],
            "weight": 0.1,
        },
        {
            "provider": "groq",
            "api_key": os.environ["GROQ_API_KEY"],
            "weight": 0.9,
            # Groq needs an explicit model; this overrides the request's model.
            "override_params": {"model": "llama3-70b-8192"},
        },
    ],
}

client = Portkey(config=lb_config)

response = client.chat.completions.create(
    messages=[{"role": "user", "content": "What's the meaning of life?"}],
    model="gpt-4o-mini",  # used when the request lands on the OpenAI target
)

print(response.choices[0].message.content)
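Conceptually, a loadbalance strategy amounts to a weighted random choice over the configured targets on each request. A minimal sketch of that mechanism (illustrative only, not Portkey's actual implementation):

```python
import random

# Hypothetical targets mirroring the config above: (name, weight) pairs.
targets = [("openai", 0.1), ("groq", 0.9)]

def pick_target(rng: random.Random) -> str:
    """Weighted random choice, as a loadbalance strategy would make per request."""
    names = [name for name, _ in targets]
    weights = [weight for _, weight in targets]
    return rng.choices(names, weights=weights, k=1)[0]

rng = random.Random(0)
picks = [pick_target(rng) for _ in range(1000)]
print(picks.count("groq"))  # roughly 900 of 1000 requests land on Groq
```

Because the weights are relative, they need not sum to 1; doubling both would produce the same 1:9 split.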

Implementing conditional routing:


from portkey_ai import Portkey
import os

openai_api_key = os.environ["OPENAI_API_KEY"]
groq_api_key = os.environ["GROQ_API_KEY"]

pk_config = {
    "strategy": {
        "mode": "conditional",
        "conditions": [
            {
                "query": {"metadata.user_plan": {"$eq": "pro"}},
                "then": "openai"
            },
            {
                "query": {"metadata.user_plan": {"$eq": "basic"}},
                "then": "groq"
            }
        ],
        "default": "groq"
    },
    "targets": [
        {
            "name": "openai",
            "provider": "openai",
            "api_key": openai_api_key
        },
        {
            "name": "groq",
            "provider": "groq",
            "api_key": groq_api_key,
            "override_params": {
                "model": "llama3-70b-8192"
            }
        }
    ]
}

metadata = {
    "user_plan": "pro"  # matched against metadata.user_plan in the conditions above
}

client = Portkey(config=pk_config, metadata=metadata)

response = client.chat.completions.create(
    messages=[{"role": "user", "content": "What's the meaning of life?"}]
)
print(response.choices[0].message.content)

The example above routes on the metadata value user_plan: requests from "pro" users go to OpenAI, while "basic" users (and anything unmatched) fall through to the default Groq target. This pattern is useful for SaaS providers offering AI features across freemium tiers.
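The conditional strategy's query objects use a Mongo-style operator syntax. A toy evaluator for the `$eq` operator used above (hypothetical helper names; a sketch of the matching logic, not Portkey's implementation) shows how a request's metadata resolves to a target name:

```python
# Conditions mirroring pk_config above: (query, target-name) pairs,
# checked in order; the first match wins.
conditions = [
    ({"metadata.user_plan": {"$eq": "pro"}}, "openai"),
    ({"metadata.user_plan": {"$eq": "basic"}}, "groq"),
]
default_target = "groq"

def resolve_target(metadata: dict) -> str:
    """Return the first target whose $eq condition matches the metadata."""
    for query, target in conditions:
        matched = all(
            metadata.get(field.removeprefix("metadata.")) == op["$eq"]
            for field, op in query.items()
        )
        if matched:
            return target
    return default_target

print(resolve_target({"user_plan": "pro"}))    # openai
print(resolve_target({"user_plan": "basic"}))  # groq
print(resolve_target({"user_plan": "free"}))   # groq (default)
```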

Harnessing Portkey AI Gateway for LLM integration

Portkey represents a notable innovation in LLM integration, addressing critical challenges in managing multiple providers and optimizing performance. By providing an open-source framework for seamless interaction with various LLM providers, the project fills a real gap in current AI development workflows.

The project thrives on community collaboration, welcoming contributions from developers worldwide. With an active GitHub community and open issues, Portkey encourages developers to participate in expanding its capabilities. The project’s transparent development approach and open-source licensing make it accessible for both individual developers and enterprise teams.
