# AI Core Plugin

Provider abstraction for OpenAI, Anthropic, and Ollama. Foundation for all AI plugins.

The AI Core plugin provides a unified provider interface for LLM completions, embeddings, and streaming. It is the foundation that all other AI plugins depend on.
## Installation

Copy the plugin directory into your FastCMS `plugins/` folder:

```bash
cp -r ai_core/ plugins/ai_core/
```

Restart FastCMS. The plugin will be auto-detected.
## Configuration

Configure the provider via the API or admin panel:

```bash
# Via API
curl -X POST http://localhost:8000/api/v1/plugins/ai/configure \
  -H "Content-Type: application/json" \
  -d '{
    "provider": "openai",
    "api_key": "sk-your-key-here",
    "model": "gpt-4o-mini"
  }'
```

Or set values in the plugin settings:

```bash
curl -X PATCH http://localhost:8000/api/v1/admin/plugins/ai-core/settings \
  -H "Content-Type: application/json" \
  -d '{
    "provider": "openai",
    "api_key": "sk-your-key-here",
    "model": "gpt-4o-mini"
  }'
```

## Supported Providers
| Provider | Completions | Embeddings | Streaming | Tool Calling |
|---|---|---|---|---|
| OpenAI | Yes | Yes | Yes | Yes |
| Anthropic | Yes | No (use separate embed provider) | Yes | Yes |
| Ollama | Yes | Yes | Yes | Yes |
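The capability matrix above can also be encoded for programmatic checks, e.g. to fail fast before calling an unsupported feature. This is an illustrative sketch mirroring the table, not an API that AI Core itself exposes:

```python
# Capability matrix mirroring the provider table (illustrative names).
CAPABILITIES = {
    "openai":    {"completions", "embeddings", "streaming", "tools"},
    "anthropic": {"completions", "streaming", "tools"},  # no native embeddings
    "ollama":    {"completions", "embeddings", "streaming", "tools"},
}

def supports(provider: str, feature: str) -> bool:
    """Return True if the given provider supports the given feature."""
    return feature in CAPABILITIES.get(provider, set())

print(supports("ollama", "embeddings"))     # True
print(supports("anthropic", "embeddings"))  # False
```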
### OpenAI

```json
{
  "provider": "openai",
  "api_key": "sk-...",
  "model": "gpt-4o-mini"
}
```

Works with any OpenAI-compatible API (e.g., Azure OpenAI, local proxies) by setting `base_url`.
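For instance, a local proxy or other OpenAI-compatible deployment might be configured like this (the `base_url` value below is illustrative; use your deployment's actual URL):

```json
{
  "provider": "openai",
  "api_key": "sk-...",
  "base_url": "https://my-proxy.example.com/v1",
  "model": "gpt-4o-mini"
}
```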
### Anthropic

```json
{
  "provider": "anthropic",
  "api_key": "sk-ant-...",
  "model": "claude-sonnet-4-20250514"
}
```

Anthropic doesn't provide an embedding API. Configure a separate embedding provider:

```json
{
  "provider": "anthropic",
  "api_key": "sk-ant-...",
  "embed_provider": "openai",
  "embed_api_key": "sk-...",
  "embed_model": "text-embedding-3-small"
}
```

### Ollama (Local)
```json
{
  "provider": "ollama",
  "api_key": "",
  "base_url": "http://localhost:11434",
  "model": "llama3.1:8b"
}
```

No API key is needed. Install Ollama and pull a model:

```bash
ollama pull llama3.1:8b
ollama pull nomic-embed-text  # for embeddings
```

## API Endpoints
All endpoints are mounted at `/api/v1/plugins/ai/`.
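Client code that calls several of these endpoints can derive URLs from that mount point. A minimal sketch (the host assumes a default local FastCMS instance):

```python
# Mount point for the AI Core plugin on a local FastCMS instance (assumed host/port).
BASE_URL = "http://localhost:8000/api/v1/plugins/ai"

def ai_endpoint(path: str) -> str:
    """Build a full URL for an AI Core endpoint under the plugin mount point."""
    return f"{BASE_URL}/{path.lstrip('/')}"

print(ai_endpoint("status"))     # http://localhost:8000/api/v1/plugins/ai/status
print(ai_endpoint("/generate"))  # http://localhost:8000/api/v1/plugins/ai/generate
```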
### GET /ai/status

Check if the provider is configured.

```json
{
  "configured": true,
  "available_providers": ["openai", "anthropic", "ollama"],
  "provider": "openai",
  "model": "gpt-4o-mini"
}
```

### POST /ai/generate
Generate a completion.

```bash
curl -X POST http://localhost:8000/api/v1/plugins/ai/generate \
  -H "Content-Type: application/json" \
  -d '{
    "messages": [
      {"role": "system", "content": "You are a helpful assistant."},
      {"role": "user", "content": "What is FastCMS?"}
    ],
    "temperature": 0.7,
    "max_tokens": 500
  }'
```

Response:

```json
{
  "content": "FastCMS is an open-source Backend-as-a-Service...",
  "model": "gpt-4o-mini",
  "usage": {"prompt_tokens": 25, "completion_tokens": 50, "total_tokens": 75}
}
```

### POST /ai/generate/stream
Stream a completion as Server-Sent Events.

```bash
curl -N http://localhost:8000/api/v1/plugins/ai/generate/stream \
  -H "Content-Type: application/json" \
  -d '{
    "messages": [{"role": "user", "content": "Tell me a story"}]
  }'
```

### POST /ai/configure
Configure the provider at runtime (no restart needed).
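As a sketch, another service could drive this endpoint programmatically. Here the request body is built and validated before sending; the actual HTTP call (commented out) would use httpx, which FastCMS already ships. Field names follow the configure examples above, while the provider whitelist is an assumption for illustration:

```python
import json

# Providers listed in this document (assumed to be the valid set).
VALID_PROVIDERS = {"openai", "anthropic", "ollama"}

def build_configure_payload(provider: str, api_key: str, model: str) -> str:
    """Serialize a request body for POST /ai/configure, rejecting unknown providers."""
    if provider not in VALID_PROVIDERS:
        raise ValueError(f"unknown provider: {provider!r}")
    return json.dumps({"provider": provider, "api_key": api_key, "model": model})

payload = build_configure_payload("ollama", "", "llama3.1:8b")
# import httpx
# httpx.post(
#     "http://localhost:8000/api/v1/plugins/ai/configure",
#     content=payload,
#     headers={"Content-Type": "application/json"},
# )
```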
## Using in Other Plugins

Other plugins can import from AI Core:

```python
# In your plugin's route handler
from fastcms_plugin.ai_core.providers import get_provider, get_embed_provider

async def my_handler():
    provider = get_provider()
    result = await provider.complete([
        {"role": "user", "content": "Hello"}
    ])
    return result.content
```

## Dependencies

None — uses httpx, which is already included in FastCMS.
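To illustrate the provider abstraction that AI Core builds on, here is a minimal self-contained sketch of the pattern: one `complete()` interface that concrete providers implement, so calling code stays provider-agnostic. The names mirror the snippet in "Using in Other Plugins" but are illustrative, not AI Core's actual classes:

```python
import asyncio
from dataclasses import dataclass
from typing import Protocol

@dataclass
class CompletionResult:
    content: str
    model: str

class Provider(Protocol):
    """Common interface each backend (OpenAI, Anthropic, Ollama, ...) would implement."""
    async def complete(self, messages: list[dict]) -> CompletionResult: ...

class EchoProvider:
    """Toy provider that echoes the last user message, standing in for a real backend."""
    async def complete(self, messages: list[dict]) -> CompletionResult:
        last = messages[-1]["content"]
        return CompletionResult(content=f"echo: {last}", model="echo-1")

async def main() -> str:
    provider: Provider = EchoProvider()
    result = await provider.complete([{"role": "user", "content": "Hello"}])
    return result.content

print(asyncio.run(main()))  # echo: Hello
```

Because callers depend only on the `Provider` interface, swapping backends is a configuration change rather than a code change.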