# Langflow Plugin
The Langflow plugin integrates Langflow with FastCMS, providing a powerful visual interface for building AI workflows without writing code.
## What is Langflow?
Langflow is an open-source visual platform for building AI-powered workflows (138K+ GitHub stars). It provides:
- Visual Drag-and-Drop Builder — Create AI workflows without coding
- 50+ Integrations — OpenAI, Anthropic, Google, vector databases, and more
- Real-time Streaming — Token-by-token LLM output
- Production Ready — Battle-tested in production by thousands of teams
## Why Use This Integration?
| Benefit | Description |
|---|---|
| Better UI | Professional visual editor maintained by dedicated team |
| More Features | 50+ integrations vs custom implementation |
| Less Code | ~350 lines vs 1,655 lines for a custom solution |
| Zero Dependencies | No additional Python packages required |
| Community Updates | New features added constantly by the Langflow team |
## Installation

### Step 1: Run Langflow

**Option A: Docker (Recommended)**
```shell
docker run -d -p 7860:7860 langflowai/langflow:latest
```

**Option B: pip**
```shell
pip install langflow
langflow run --port 7860
```

**Option C: Docker Compose with FastCMS**
```yaml
services:
  fastcms:
    build: .
    ports:
      - "8000:8000"
    environment:
      - LANGFLOW_URL=http://langflow:7860
    depends_on:
      - langflow

  langflow:
    image: langflowai/langflow:latest
    ports:
      - "7860:7860"
    environment:
      - LANGFLOW_AUTO_LOGIN=true
    volumes:
      - langflow_data:/app/langflow

volumes:
  langflow_data:
```

### Step 2: Configure FastCMS
Add to your .env file:
```shell
LANGFLOW_URL=http://localhost:7860
LANGFLOW_API_KEY=your-api-key   # Optional (Langflow 1.5+)
LANGFLOW_EMBED_UI=true          # Embed in admin UI (default)
```

### Step 3: Verify Connection
1. Start FastCMS
2. Navigate to `/admin/langflow`
3. Check the connection status indicator
## Configuration

| Variable | Description | Default |
|---|---|---|
| `LANGFLOW_URL` | Langflow server URL | `http://localhost:7860` |
| `LANGFLOW_API_KEY` | API key for authentication | (empty) |
| `LANGFLOW_EMBED_UI` | Embed Langflow UI in admin | `true` |
| `LANGFLOW_PROXY_API` | Enable API proxy | `true` |
| `LANGFLOW_REQUEST_TIMEOUT` | Request timeout in seconds | `300` |
## UI Modes

- **Embedded Mode** (`LANGFLOW_EMBED_UI=true`) — the Langflow editor is embedded directly via an iframe.
- **Link Mode** (`LANGFLOW_EMBED_UI=false`) — shows a link to the external Langflow instance with a feature overview.
## API Reference

All endpoints require FastCMS authentication.

### Health Check
```http
GET /api/v1/plugins/langflow/health
```

Response:

```json
{
  "status": "connected",
  "langflow_url": "http://localhost:7860"
}
```

### List Flows
```http
GET /api/v1/plugins/langflow/flows
Authorization: Bearer <token>
```

Response:

```json
{
  "flows": [
    {
      "id": "flow-uuid",
      "name": "My Chat Bot",
      "description": "Customer support chatbot"
    }
  ],
  "total": 1
}
```

### Execute Flow
```http
POST /api/v1/plugins/langflow/flows/{flow_id}/run
Authorization: Bearer <token>
Content-Type: application/json

{
  "input_value": "Hello, how can you help me?",
  "tweaks": {}
}
```

### Execute Flow with Streaming (SSE)
```http
POST /api/v1/plugins/langflow/flows/{flow_id}/run/stream
Authorization: Bearer <token>
Content-Type: application/json

{
  "input_value": "Explain quantum computing"
}
```

Response (SSE):

```
data: {"type": "token", "content": "Quantum"}
data: {"type": "token", "content": " computing"}
...
data: {"type": "end"}
```

## Usage Examples
### JavaScript (with streaming)
```javascript
// Note: `token` must hold a valid FastCMS JWT.
async function runFlowStream(flowId, input, onToken) {
  const response = await fetch(
    `/api/v1/plugins/langflow/flows/${flowId}/run/stream`,
    {
      method: 'POST',
      headers: {
        'Authorization': `Bearer ${token}`,
        'Content-Type': 'application/json'
      },
      body: JSON.stringify({ input_value: input })
    }
  );

  const reader = response.body.getReader();
  const decoder = new TextDecoder();

  while (true) {
    const { value, done } = await reader.read();
    if (done) break;
    // Parse each "data: ..." line of the SSE stream
    const lines = decoder.decode(value).split('\n');
    for (const line of lines) {
      if (line.startsWith('data: ')) {
        const data = JSON.parse(line.slice(6));
        onToken(data);
      }
    }
  }
}

runFlowStream('flow-id', 'Hello!', (data) => {
  if (data.type === 'token') process.stdout.write(data.content);
});
```

### Python
```python
import json
import httpx

class FastCMSLangflowClient:
    def __init__(self, base_url: str, token: str):
        self.base_url = base_url
        self.headers = {"Authorization": f"Bearer {token}"}

    async def run_flow_stream(self, flow_id: str, input_value: str):
        async with httpx.AsyncClient() as client:
            async with client.stream(
                "POST",
                f"{self.base_url}/api/v1/plugins/langflow/flows/{flow_id}/run/stream",
                headers=self.headers,
                json={"input_value": input_value},
                timeout=300,
            ) as response:
                # Yield each parsed SSE event as a dict
                async for line in response.aiter_lines():
                    if line.startswith("data: "):
                        yield json.loads(line[6:])

# Usage:
#   client = FastCMSLangflowClient("http://localhost:8000", token)
#   async for event in client.run_flow_stream("flow-id", "Hello!"):
#       print(event)
```

## Production Deployment
### nginx Configuration
```nginx
server {
    listen 443 ssl;
    server_name cms.example.com;

    location / {
        proxy_pass http://fastcms:8000;
        proxy_set_header Host $host;
    }

    location /langflow/ {
        proxy_pass http://langflow:7860/;
        proxy_set_header Host $host;
        proxy_http_version 1.1;
        proxy_set_header Upgrade $http_upgrade;
        proxy_set_header Connection "upgrade";
    }
}
```

## Security
Authentication flow:

```
User → FastCMS Auth → Plugin Routes → Langflow API
            ↓                             ↓
        JWT Token             API Key (x-api-key)
```

Best practices:
- Never expose Langflow directly — Always proxy through FastCMS
- Use API keys — Enable Langflow authentication (v1.5+)
- Limit network access — Langflow accessible only from FastCMS
- Audit flow access — FastCMS logs all plugin API requests
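The "limit network access" practice can be sketched in Docker Compose by dropping Langflow's published port, so only FastCMS can reach it over the Compose network (service names follow the installation example above; adapt to your setup):

```yaml
services:
  langflow:
    image: langflowai/langflow:latest
    # No "ports:" mapping: Langflow stays reachable from the fastcms
    # service at http://langflow:7860, but not from the host or internet.
    environment:
      - LANGFLOW_AUTO_LOGIN=true
```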
## Troubleshooting
| Problem | Solution |
|---|---|
| "Disconnected" status | Verify `curl http://localhost:7860/health` and check `LANGFLOW_URL` |
| Empty iframe | Set `LANGFLOW_EMBED_UI=false` or configure Langflow CORS |
| 401/403 errors | Generate an API key in Langflow settings and set `LANGFLOW_API_KEY` |
| Timeout errors | Increase `LANGFLOW_REQUEST_TIMEOUT` or use the streaming endpoint |