# OpenAI Agents SDK
The OpenAI Agents SDK can be pointed at any OpenAI-compatible endpoint by configuring a custom provider.
## Install

```shell
pip install openai-agents
```

## Configure
Create an `OpenAIProvider` that points at the Doubleword API and pass it via `RunConfig`:
```python
import asyncio

from openai import AsyncOpenAI
from agents import Agent, Runner, RunConfig
from agents.models.openai_provider import OpenAIProvider

provider = OpenAIProvider(
    openai_client=AsyncOpenAI(
        base_url="https://api.doubleword.ai/v1",
        api_key="{{apiKey}}",
    ),
)

agent = Agent(
    name="my-agent",
    model="{{selectedModel.id}}",
    instructions="You are a helpful assistant.",
)

result = asyncio.run(
    Runner.run(
        agent,
        "Say hello.",
        run_config=RunConfig(model_provider=provider),
    )
)
print(result.final_output)
```

The Doubleword API supports both the Responses API and the Chat Completions API, so the SDK works with its default settings.
## Why `OpenAIProvider` instead of `set_default_openai_client`
The Agents SDK also offers `set_default_openai_client` as a simpler global configuration:

```python
from openai import AsyncOpenAI
from agents import set_default_openai_client

client = AsyncOpenAI(
    base_url="https://api.doubleword.ai/v1",
    api_key="{{apiKey}}",
)
set_default_openai_client(client, use_for_tracing=False)
```

This works when model names follow the `provider/model` convention the SDK expects (e.g. `openai/gpt-4o`). However, it fails for model names that contain a `/` that doesn't match a known provider prefix, such as `Qwen/Qwen3-30B` or `meta-llama/Llama-3.1-8B`.
The `OpenAIProvider` approach avoids this issue entirely: it bypasses the SDK's built-in provider routing and sends the model name directly to your endpoint.
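The failure mode is easy to see in isolation. Below is a hedged sketch of prefix-based routing (illustrative only, not the SDK's actual code; the registry and function names are hypothetical):

```python
# Hypothetical sketch of prefix-based model routing, to illustrate why
# model names like "Qwen/Qwen3-30B" break a global default client:
# the text before the first "/" is treated as a provider prefix, and
# unknown prefixes are rejected before any request is sent.
KNOWN_PREFIXES = {"openai", "litellm"}  # hypothetical registry

def route(model_name: str) -> tuple[str, str]:
    """Split 'provider/model' and reject unknown provider prefixes."""
    if "/" not in model_name:
        # Bare names fall through to the default provider.
        return ("openai", model_name)
    prefix, rest = model_name.split("/", 1)
    if prefix not in KNOWN_PREFIXES:
        raise ValueError(f"unknown model provider prefix: {prefix!r}")
    return (prefix, rest)

route("openai/gpt-4o")     # recognized prefix: routes normally
# route("Qwen/Qwen3-30B")  # would raise ValueError: the "Qwen" in the
#                          # model name is mistaken for a provider prefix
```

Passing an explicit `model_provider` via `RunConfig` sidesteps this split entirely, so the full model string reaches your endpoint unchanged.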
## Key options

- `use_for_tracing=False`: if using the `set_default_openai_client` approach, set this to `False`, since the Doubleword API does not implement OpenAI's tracing API.
- `use_responses=False`: optionally forces the SDK to use `/chat/completions` instead of the Responses API. Not needed with Doubleword, since both are supported.
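For completeness, here is a sketch of how `use_responses=False` would be wired into the provider from the configuration above. This is a config fragment, not something Doubleword requires; it applies when targeting an endpoint that only implements `/chat/completions`:

```python
from openai import AsyncOpenAI
from agents.models.openai_provider import OpenAIProvider

# Force the SDK to call /chat/completions instead of the Responses API.
# Unnecessary for Doubleword (both are supported); shown for endpoints
# that lack the Responses API.
provider = OpenAIProvider(
    openai_client=AsyncOpenAI(
        base_url="https://api.doubleword.ai/v1",
        api_key="{{apiKey}}",
    ),
    use_responses=False,
)
```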