# Providers

## Provider Adapters

ArtemisKit supports multiple LLM providers through adapters. Choose the one that fits your needs.
## Available Providers

| Provider | Package | Best For |
|---|---|---|
| OpenAI | @artemiskit/adapter-openai | GPT models, production |
| Azure OpenAI | @artemiskit/adapter-openai | Enterprise, compliance |
| OpenAI-Compatible | @artemiskit/adapter-openai | Groq, Together, Ollama, etc. |
| Anthropic | @artemiskit/adapter-anthropic | Claude models |
| Vercel AI | @artemiskit/adapter-vercel-ai | Multi-provider apps |
## Quick Setup

### OpenAI
```bash
export OPENAI_API_KEY="sk-..."
```

```yaml
provider: openai
model: gpt-5
```

### Anthropic
```bash
export ANTHROPIC_API_KEY="sk-ant-..."
```

```yaml
provider: anthropic
model: claude-sonnet-4-5-20241022
```

### Azure OpenAI
```bash
export AZURE_OPENAI_API_KEY="..."
export AZURE_OPENAI_ENDPOINT="https://your-resource.openai.azure.com"
```

```yaml
provider: azure-openai
model: gpt-5

providers:
  azure-openai:
    apiKey: ${AZURE_OPENAI_API_KEY}
    resourceName: ${AZURE_OPENAI_RESOURCE_NAME}
    deploymentName: ${AZURE_OPENAI_DEPLOYMENT_NAME}
```

### Vercel AI SDK
```bash
export OPENAI_API_KEY="sk-..." # or other provider keys
```

```yaml
provider: vercel-ai
model: openai:gpt-4o # format: provider:model
```

### OpenAI-Compatible (Groq, Together, Ollama, etc.)
```bash
export GROQ_API_KEY="gsk_..." # or other provider key
```

```yaml
provider: openai
model: llama-3.3-70b-versatile

providers:
  openai:
    apiKey: ${GROQ_API_KEY}
    baseUrl: https://api.groq.com/openai/v1
```

See OpenAI-Compatible Providers for Groq, Together AI, Fireworks, OpenRouter, Ollama, LM Studio, and vLLM setup guides.
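The same `baseUrl` pattern works for a local Ollama server. As a sketch (the URL is Ollama's default OpenAI-compatible endpoint; the model name and key value are placeholder assumptions — Ollama accepts any non-empty key):

```yaml
provider: openai
model: llama3.3   # any model you have pulled locally

providers:
  openai:
    apiKey: ollama                       # Ollama ignores the key, but one must be set
    baseUrl: http://localhost:11434/v1   # Ollama's OpenAI-compatible endpoint
```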
## Feature Comparison

| Feature | OpenAI | Azure | Anthropic | Vercel AI |
|---|---|---|---|---|
| Streaming | Yes | Yes | Yes | Yes |
| Function Calling | Yes | Yes | Yes | Varies |
| Vision | Yes | Yes | Yes | Varies |
| JSON Mode | Yes | Yes | No | Varies |
| Max Context | 128K | 128K | 200K | Varies |
## Configuration Precedence

Settings are applied in this order (highest to lowest priority):

1. CLI flags - `--provider`, `--model`
2. Scenario file - `provider:`, `model:`
3. Config file - `artemis.config.yaml`
4. Environment variables
5. Defaults
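The precedence chain above amounts to "first defined value wins, scanning from highest priority down." A minimal generic sketch of that resolution logic (not ArtemisKit's actual implementation; the dictionaries stand in for the five sources):

```python
def resolve_setting(key, *sources):
    """Return the first non-None value for `key`, scanning sources
    from highest to lowest priority."""
    for source in sources:
        value = source.get(key)
        if value is not None:
            return value
    return None

# Hypothetical values from each source, highest priority first.
cli      = {"model": "gpt-5"}                    # --model gpt-5
scenario = {"provider": "anthropic"}             # provider: in the scenario file
config   = {"provider": "openai"}                # artemis.config.yaml
env      = {}
defaults = {"provider": "openai", "model": "gpt-4o"}

# The CLI flag wins for model; the scenario file wins for provider.
print(resolve_setting("model", cli, scenario, config, env, defaults))     # gpt-5
print(resolve_setting("provider", cli, scenario, config, env, defaults))  # anthropic
```

Here the CLI's `--model` overrides every lower source, while `provider` falls through to the scenario file because no CLI flag set it.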