AI Flows: Langflow as the Platform's Intelligence Layer

Embedding Langflow as a managed AI service with auto-switching flows, MCP tool integration, and context-aware assistance.

We embedded Langflow as a managed AI service with auto-switching flows, user preferences, MCP tool integration, and BFF-style authentication. A standalone LLM playground became a context-aware platform assistant.


Beyond a chatbot

Every data platform has a chat widget now. Most are thin wrappers around an LLM API that answer generic questions. We wanted something different: an AI layer that knows where you are in the platform and has access to your tools.

When you’re on the Query Editor, the AI helps with SQL. When you’re in the dbt workspace, it helps with model development. When you’re browsing the catalog, it helps explore data. Same chat widget, different capabilities, automatically selected based on context.

Architecture

Langflow as the flow engine

We chose Langflow because it separates the “what the AI can do” (flows) from the “how to talk to it” (our chat UI). Data engineers and AI specialists can design flows visually in the Langflow editor without touching platform code. The platform just runs them.

Three system flows ship by default:

  • Platform Help for general navigation and feature guidance
  • Text2SQL Agent for converting natural language to SQL using catalog metadata
  • dbt Assistant for generating models, explaining lineage, resolving errors

Each flow connects to the platform’s MCP server, giving the LLM access to real tools: run SQL queries, browse catalogs, list tables, read schemas. The AI doesn’t just talk about data. It can actually query it.

Custom MCP component

The DataPlatformMCPTools Langflow component auto-discovers all available MCP tools from the platform’s MCP server via SSE. It wraps each tool as a LangChain StructuredTool, forwarding the user’s JWT for per-user RBAC enforcement. If user A asks “show me the sales data,” the MCP tools respect user A’s permissions. The AI can’t see tables the user can’t see.
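The shape of that idea can be sketched in TypeScript (the real component is a Langflow component; names like buildUserScopedTools and ToolDef here are illustrative, not the component's actual API):

```typescript
// Illustrative sketch: wrap discovered MCP tool definitions so that every
// invocation forwards the calling user's JWT. All names are hypothetical.
interface ToolDef {
  name: string;
  description: string;
  endpoint: string; // MCP server endpoint that executes this tool
}

interface WrappedTool {
  name: string;
  description: string;
  call: (args: Record<string, unknown>) => Promise<unknown>;
}

function buildUserScopedTools(defs: ToolDef[], userJwt: string): WrappedTool[] {
  return defs.map((def) => ({
    name: def.name,
    description: def.description,
    // The Authorization header carries the *user's* token, so the MCP
    // server enforces that user's RBAC on every tool call.
    call: async (args) => {
      const res = await fetch(def.endpoint, {
        method: "POST",
        headers: {
          "Content-Type": "application/json",
          Authorization: `Bearer ${userJwt}`,
        },
        body: JSON.stringify({ tool: def.name, arguments: args }),
      });
      return res.json();
    },
  }));
}
```

Because the token travels with each call rather than being baked into the component, the same flow serves every user at their own permission level.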

BFF-style streaming

The chat communication follows the platform’s BFF pattern. The frontend sends messages to the backend, which proxies them to Langflow with server-side authentication. SSE streaming delivers tokens back through nginx, which has special proxy configuration for long-lived connections:

location /api/platform/v1/flows/ {
    proxy_buffering off;
    proxy_cache off;
    proxy_read_timeout 3600s;
}

The user never directly communicates with Langflow. All requests go through the authenticated backend.
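A minimal sketch of the hand-off, assuming Langflow's standard run endpoint and API-key header (LANGFLOW_URL and buildProxyRequest are illustrative names, not our actual code):

```typescript
// The backend, not the browser, holds the Langflow credentials. It builds
// the upstream request itself, so no client-supplied auth ever reaches
// Langflow directly.
const LANGFLOW_URL = "http://langflow:7860"; // internal address, never exposed

function buildProxyRequest(flowId: string, message: string, apiKey: string) {
  return {
    url: `${LANGFLOW_URL}/api/v1/run/${flowId}?stream=true`,
    init: {
      method: "POST",
      headers: {
        "Content-Type": "application/json",
        // Server-side credential: injected here, invisible to the user.
        "x-api-key": apiKey,
      },
      body: JSON.stringify({
        input_value: message,
        input_type: "chat",
        output_type: "chat",
      }),
    },
  };
}
```

The backend then pipes the streamed response body back to the frontend as SSE, which is where the nginx settings above come in.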

Auto-switching with user preferences

When you navigate between pages, the platform automatically switches to the best flow:

  1. Check user preference for this page category (saved per-user in PostgreSQL)
  2. Fall back to admin default for this category (saved globally)
  3. Fall back to first matching system flow
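The three-step resolution above is a straightforward fallback chain; a minimal sketch (resolveFlow and the Flow shape are illustrative):

```typescript
// Resolve the active flow for a page category: user preference first,
// then admin default, then the first matching system flow.
interface Flow {
  id: string;
  category: string;
  isSystem: boolean;
}

function resolveFlow(
  category: string,
  flows: Flow[],
  userPrefs: Record<string, string>,    // per-user, persisted in PostgreSQL
  adminDefaults: Record<string, string> // global, admin-managed
): Flow | undefined {
  const byId = (id?: string) => flows.find((f) => f.id === id);
  return (
    byId(userPrefs[category]) ??                              // 1. user preference
    byId(adminDefaults[category]) ??                          // 2. admin default
    flows.find((f) => f.isSystem && f.category === category)  // 3. first system flow
  );
}
```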

Page categories map routes to flow types: / maps to query, /dbt/* maps to dbt, /admin/catalogs maps to data, everything else maps to help.
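That mapping is simple enough to show in full (categoryForRoute is an illustrative name):

```typescript
// Map a route to its AI flow category, per the rules described above.
function categoryForRoute(path: string): string {
  if (path === "/") return "query";          // Query Editor
  if (path.startsWith("/dbt")) return "dbt"; // dbt workspace
  if (path === "/admin/catalogs") return "data"; // catalog browser
  return "help";                             // everything else
}
```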

Users can star their preferred flow per page. A star button in the chat dropdown saves the preference. Admins set global defaults from the AI Flows management page.

Flow category persistence

Flow categories were initially stored in memory, lost on every server restart. We added two database tables:

  • flow_category_defaults for admin-set global defaults per page category
  • user_flow_preferences for per-user overrides

This required careful state management. The GlobalAIChatContext had an infinite re-render loop because useFlowChat() returned a new object every render, causing cascading dependency invalidation. We stabilized it with refs:

import { useCallback, useRef } from "react";

// Keep the latest chat object in a ref so the callback identity stays
// stable across renders, breaking the invalidation cascade.
const chatRef = useRef(chat);
chatRef.current = chat;
const stableSwitchFlow = useCallback(
  (id: string) => chatRef.current.switchFlow(id),
  []
);

Langflow 1.8.3 compatibility

Upgrading to Langflow 1.8.3 broke all our flows. The AgentComponent API changed. The embedded component code in our flow JSONs referenced a non-existent model attribute. The fix required:

  1. Manually rebuilding the flows in the Langflow UI
  2. Re-exporting the JSONs with 1.8.3’s built-in Agent component code
  3. Updating the ChatInput/ChatOutput components for the dbt Assistant flow

We also discovered that Langflow renamed /api/v1/folders/ to /api/v1/projects/. A one-line backend fix that took 30 minutes to diagnose.

What we learned

The flow URL was doubled. getAPIBaseUrl() returns /api/platform/v1, and useFlowChat appended /api/platform/v1/flows/{id}/run on top of it. The result: /api/platform/v1/api/platform/v1/flows/... which returned 404. A classic base URL concatenation bug.
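One way to harden against this class of bug is to join paths through a single helper that strips an already-present prefix before concatenating (apiUrl here is a sketch, not our actual fix):

```typescript
// Join an API path to the base exactly once, even if a caller already
// included the base prefix.
const API_BASE = "/api/platform/v1";

function apiUrl(path: string): string {
  const rel = path.startsWith(API_BASE) ? path.slice(API_BASE.length) : path;
  return `${API_BASE}/${rel.replace(/^\/+/, "")}`;
}
```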

Flow icons were rendering as text. The Langflow flow metadata stores icon names as strings like “Braces” and “MessagesSquare”, which are Lucide component names. We needed an ICON_MAP to resolve these to actual React components.
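The lookup itself is a small map with a fallback. A sketch with stubbed components (in the app the values come from lucide-react; HelpCircle as the fallback is an assumption):

```typescript
// Flow metadata stores Lucide icon names as strings, so the UI needs a
// name → component map. Component values are stubbed here for illustration.
type IconComponent = () => string; // stand-in for a React component type

const Braces: IconComponent = () => "<Braces/>";
const MessagesSquare: IconComponent = () => "<MessagesSquare/>";
const HelpCircle: IconComponent = () => "<HelpCircle/>";

const ICON_MAP: Record<string, IconComponent> = { Braces, MessagesSquare };

function resolveIcon(name: string): IconComponent {
  // Fall back to a default icon instead of rendering the raw string.
  return ICON_MAP[name] ?? HelpCircle;
}
```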

The read-only filesystem caught us. Langflow’s custom components are mounted as :ro in Docker for security. This is correct for production but confused us when trying to debug component loading issues. Changes in the Langflow UI couldn’t persist to the filesystem.


AI Flows integration was built in April 2026. Design specs: docs/superpowers/specs/2026-04-04-text2sql-langflow-chat-design.md and docs/superpowers/specs/2026-04-06-globalchat-redesign-design.md.
