Doubleword

CrewAI

CrewAI uses LiteLLM under the hood, so it supports custom OpenAI-compatible endpoints out of the box.

Install

pip install crewai litellm

CrewAI delegates all LLM calls to LiteLLM. Depending on how your environment resolves dependencies, litellm may not be pulled in automatically, so install it explicitly to be safe.
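Before running a crew, you can confirm both packages are importable. This is a small sketch using only the standard library; the helper name is illustrative, not part of CrewAI or LiteLLM:

```python
import importlib.util

def check_missing(packages):
    """Return the subset of packages that are not importable."""
    return [p for p in packages if importlib.util.find_spec(p) is None]

missing = check_missing(["crewai", "litellm"])
if missing:
    print("Missing packages:", ", ".join(missing))
```

If anything is listed as missing, rerun the pip install command above.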

Configure

from crewai import Agent, Task, Crew, LLM

llm = LLM(
    model="openai/{{selectedModel.id}}",
    base_url="https://api.doubleword.ai/v1",
    api_key="{{apiKey}}",
)

agent = Agent(
    role="Assistant",
    goal="Answer questions",
    backstory="A helpful assistant.",
    llm=llm,
)

task = Task(
    description="Say hello.",
    expected_output="A greeting.",
    agent=agent,
)

crew = Crew(agents=[agent], tasks=[task])
result = crew.kickoff()
print(result)
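If you prefer to keep credentials out of code, LiteLLM also reads OpenAI-style environment variables. This is an assumption about your LiteLLM version's variable names (OPENAI_API_KEY and OPENAI_API_BASE are the long-standing ones); set them before constructing the LLM, and the explicit base_url and api_key arguments become optional:

```python
import os

# Assumption: your LiteLLM version honors these OpenAI-style variables.
# Set them before constructing LLM(...) so the custom endpoint is picked up.
os.environ["OPENAI_API_KEY"] = "{{apiKey}}"
os.environ["OPENAI_API_BASE"] = "https://api.doubleword.ai/v1"
```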

The model string must be prefixed with openai/ so LiteLLM routes it through the OpenAI-compatible code path. The part after the slash should match a model name configured in your Control Layer.
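The "provider/model" convention can be sketched as a split on the first slash. This is illustrative only, not LiteLLM's actual routing code, which handles many providers and edge cases:

```python
def split_model_string(model: str) -> tuple[str, str]:
    """Split 'provider/model' on the first slash, mirroring the
    convention LiteLLM uses to pick a code path."""
    provider, _, name = model.partition("/")
    return provider, name

# The provider part selects the OpenAI-compatible path; the name part
# is what your Control Layer must recognize.
print(split_model_string("openai/{{selectedModel.id}}"))
```

Anything after the first slash is passed through to the endpoint unchanged, which is why it has to match a model name your Control Layer serves.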