# Providers

ArtemisKit supports multiple LLM providers through adapters. Choose the one that fits your needs.

| Provider | Package | Best For |
| --- | --- | --- |
| OpenAI | `@artemiskit/adapter-openai` | GPT models, production |
| Azure OpenAI | `@artemiskit/adapter-openai` | Enterprise, compliance |
| OpenAI-Compatible | `@artemiskit/adapter-openai` | Groq, Together, Ollama, etc. |
| Anthropic | `@artemiskit/adapter-anthropic` | Claude models |
| Vercel AI | `@artemiskit/adapter-vercel-ai` | Multi-provider apps |
## OpenAI

```sh
export OPENAI_API_KEY="sk-..."
```

```yaml
provider: openai
model: gpt-5
```
## Anthropic

```sh
export ANTHROPIC_API_KEY="sk-ant-..."
```

```yaml
provider: anthropic
model: claude-sonnet-4-5-20241022
```
## Azure OpenAI

```sh
export AZURE_OPENAI_API_KEY="..."
export AZURE_OPENAI_ENDPOINT="https://your-resource.openai.azure.com"
```

```yaml
provider: azure-openai
model: gpt-5

providers:
  azure-openai:
    apiKey: ${AZURE_OPENAI_API_KEY}
    resourceName: ${AZURE_OPENAI_RESOURCE_NAME}
    deploymentName: ${AZURE_OPENAI_DEPLOYMENT_NAME}
```
## Vercel AI

```sh
export OPENAI_API_KEY="sk-..." # or other provider keys
```

```yaml
provider: vercel-ai
model: openai:gpt-4o # format: provider:model
```
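To make the `provider:model` format concrete, here is a small hypothetical helper (not part of ArtemisKit's API) showing how such an identifier can be split. It splits on the first colon only, so model names that themselves contain colons, as Ollama tags often do, survive intact:

```typescript
// Hypothetical parser for the "provider:model" format shown above.
// Splits on the FIRST colon so tags like "ollama:llama3:8b" keep
// their full model portion ("llama3:8b").
function parseModelId(id: string): { provider: string; model: string } {
  const sep = id.indexOf(":");
  if (sep === -1) {
    throw new Error(`Expected "provider:model", got "${id}"`);
  }
  return { provider: id.slice(0, sep), model: id.slice(sep + 1) };
}
```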

## OpenAI-Compatible (Groq, Together, Ollama, etc.)

```sh
export GROQ_API_KEY="gsk_..." # or other provider key
```

```yaml
provider: openai
model: llama-3.3-70b-versatile

providers:
  openai:
    apiKey: ${GROQ_API_KEY}
    baseUrl: https://api.groq.com/openai/v1
```

See OpenAI-Compatible Providers for Groq, Together AI, Fireworks, OpenRouter, Ollama, LM Studio, and vLLM setup guides.
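Overriding `baseUrl` works because these providers implement the OpenAI Chat Completions wire format. The sketch below (not ArtemisKit's internal client; `buildChatRequest` is a made-up name) shows what a request to any such endpoint looks like once `baseUrl` and `apiKey` are swapped:

```typescript
// Shape of a message in the OpenAI Chat Completions API, which
// Groq, Together, Ollama, etc. also accept.
interface ChatMessage {
  role: "system" | "user" | "assistant";
  content: string;
}

// Hypothetical helper: builds the POST request any OpenAI-compatible
// endpoint expects at <baseUrl>/chat/completions.
function buildChatRequest(
  baseUrl: string,
  apiKey: string,
  model: string,
  messages: ChatMessage[],
) {
  return {
    url: `${baseUrl.replace(/\/$/, "")}/chat/completions`,
    headers: {
      Authorization: `Bearer ${apiKey}`,
      "Content-Type": "application/json",
    },
    body: JSON.stringify({ model, messages }),
  };
}
```

The returned object can be passed straight to `fetch(req.url, { method: "POST", headers: req.headers, body: req.body })`.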

## Feature Comparison

| Feature | OpenAI | Azure | Anthropic | Vercel AI |
| --- | --- | --- | --- | --- |
| Streaming | Yes | Yes | Yes | Yes |
| Function Calling | Yes | Yes | Yes | Varies |
| Vision | Yes | Yes | Yes | Varies |
| JSON Mode | Yes | Yes | No | Varies |
| Max Context | 128K | 128K | 200K | Varies |

## Configuration Precedence

Settings are applied in this order (highest to lowest priority):

1. CLI flags - `--provider`, `--model`
2. Scenario file - `provider:`, `model:`
3. Config file - `artemis.config.yaml`
4. Environment variables
5. Defaults
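The precedence list above amounts to a first-defined-wins lookup across the five sources. This sketch (an illustration, not ArtemisKit's internal resolver) shows that logic under the assumption that each source may define `provider` and `model` independently:

```typescript
// Partial settings as any one source (CLI, scenario, config file,
// environment) might supply them.
type Settings = { provider?: string; model?: string };

// First-defined-wins resolution: sources are checked in priority
// order, falling back to defaults when no source sets a key.
function resolveSettings(
  cli: Settings,
  scenario: Settings,
  config: Settings,
  env: Settings,
  defaults: Required<Settings>,
): Required<Settings> {
  const sources = [cli, scenario, config, env];
  const pick = (key: keyof Settings): string =>
    sources.find((s) => s[key] !== undefined)?.[key] ?? defaults[key];
  return { provider: pick("provider"), model: pick("model") };
}
```

Note that resolution is per key: a scenario file's `provider:` still applies even when a `--model` CLI flag overrides the model.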