# Integrations
The Doubleword Inference API is OpenAI-compatible, so it works with any framework that supports custom OpenAI endpoints.
The guides below show how to configure each framework to route requests through the API.
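In practice, "OpenAI-compatible" means each framework only needs two settings: the API's base URL and an API key, after which requests go to the standard `/chat/completions` path. A minimal sketch of that wire format using only the Python standard library is below; the base URL and model name are placeholders, not the real Doubleword values.

```python
import json
import urllib.request

# Placeholder values -- substitute your actual Doubleword base URL and key.
BASE_URL = "https://your-doubleword-endpoint.example/v1"
API_KEY = "dw-..."

def build_chat_request(model: str, prompt: str) -> urllib.request.Request:
    """Build a standard OpenAI-style chat completions request.

    Any framework in the table below constructs an equivalent request
    once it is pointed at a custom base URL.
    """
    payload = {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    }
    return urllib.request.Request(
        f"{BASE_URL}/chat/completions",
        data=json.dumps(payload).encode("utf-8"),
        headers={
            "Authorization": f"Bearer {API_KEY}",
            "Content-Type": "application/json",
        },
        method="POST",
    )

req = build_chat_request("my-model", "Hello")
print(req.full_url)
```

Sending the request (e.g. with `urllib.request.urlopen(req)`) returns the standard OpenAI chat completion JSON, which is why clients written against the OpenAI SDK work unchanged.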
| Framework | Language | Integration |
|---|---|---|
| OpenAI Agents SDK | Python | Custom provider |
| PydanticAI | Python | `OpenAIProvider` with custom `base_url` |
| CrewAI | Python | LiteLLM with `openai/` model prefix |
| smolagents | Python | `OpenAIServerModel` with `api_base` |
| Google ADK | Python | LiteLLM with `openai/` model prefix |
| Microsoft Agent Framework | Python | `OpenAIChatCompletionClient` with `base_url` |
| atomic-agents | Python | Instructor-wrapped OpenAI client |
| Agno | Python | `OpenAILike` model class |
| Mastra | TypeScript | `@ai-sdk/openai-compatible` provider |
| OpenClaw | Agent skill | Doubleword skill for async inference |