Kong Releases Volcano: A TypeScript, MCP-Native SDK for Building Production-Ready AI Agents with LLM Inference and Practical Operations

Kong has open sourced Volcano, a TypeScript SDK for building multi-step agent workflows that span multiple LLM providers and use tools natively through the Model Context Protocol (MCP). The release coincides with broader MCP functionality across Kong's AI Gateway and Konnect, positioning Volcano as the developer-facing SDK in an MCP-governed control plane.

  • Why choose the Volcano SDK? Because 9 lines of code are faster to write and easier to maintain than 100+ lines.
  • Without the Volcano SDK: you need over 100 lines to handle tool schemas, context management, provider switching, error handling, and HTTP clients.
  • With the Volcano SDK: 9 lines.
import { agent, llmOpenAI, llmAnthropic, mcp } from "volcano-ai";


// Setup: two LLMs, two MCP servers
const planner = llmOpenAI({ model: "gpt-5-mini", apiKey: process.env.OPENAI_API_KEY! });
const executor = llmAnthropic({ model: "claude-4.5-sonnet", apiKey: process.env.ANTHROPIC_API_KEY! });
const database = mcp("
const slack = mcp("


// One workflow
await agent({ llm: planner })
 .then({
   prompt: "Analyze last week's sales data",
   mcps: [database]  // Auto-discovers and calls the right tools
 })
 .then({
   llm: executor,  // Switch to Claude
   prompt: "Write an executive summary"
 })
 .then({
   prompt: "Post the summary to #executives",
   mcps: [slack]
 })
 .run();

What does Volcano offer?

Volcano exposes a compact, chainable API, .then(...).run(), that lets developers switch LLMs at each step while passing intermediate context between steps (e.g., one model for planning and another for execution). It treats MCP as a first-class interface: developers hand Volcano a list of MCP servers, and the SDK performs tool discovery and invocation automatically. Production features include automatic retries, per-step timeouts, connection pooling for MCP servers, OAuth 2.1 authentication, and OpenTelemetry tracing/metrics for distributed observability. The project is published under the Apache-2.0 license.

The main features of the Volcano SDK include:

  • Chainable API: build multi-step workflows with a concise .then(...).run() pattern; context flows between steps.
  • MCP-native tool use: point each step at MCP servers and the SDK automatically discovers and calls the right tools.
  • Multi-provider LLM support: mix models within a workflow (e.g., one model for planning and another for execution).
  • Streaming of intermediate and final results from agent interactions.
  • Retries and timeouts, configurable per step, for reliability under real-world failures (see the sketch after this list).
  • Hooks (before/after each step) for custom behavior and instrumentation.
  • Typed error handling that surfaces actionable failures during agent execution.
  • Parallel execution, branching, and loops to express complex control flows.
  • Observability with OpenTelemetry for traces and metrics across steps and tool calls.
  • OAuth support and connection pooling for secure and efficient access to MCP servers.
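The per-step reliability and hook features above would typically be configured directly on each step. Below is a minimal sketch of what that could look like; the option names (retries, timeoutMs, hooks) and hook shapes are assumptions for illustration and may not match the SDK's actual API.

import { agent, llmOpenAI, mcp } from "volcano-ai";

// Minimal sketch only: the option names below (retries, timeoutMs, hooks)
// are assumed for illustration and may differ from the real Volcano API.
const llm = llmOpenAI({ model: "gpt-5-mini", apiKey: process.env.OPENAI_API_KEY! });
const database = mcp("https://db.internal.example.com/mcp"); // placeholder MCP server URL

await agent({ llm })
 .then({
   prompt: "Analyze last week's sales data",
   mcps: [database],
   retries: 3,          // assumed: retry this step up to 3 times on failure
   timeoutMs: 30_000,   // assumed: per-step timeout in milliseconds
   hooks: {
     before: () => console.log("step starting"),  // assumed before-step hook
     after: () => console.log("step finished"),   // assumed after-step hook
   },
 })
 .run();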

How does it fit Kong’s MCP architecture?

Kong’s Konnect platform adds several MCP governance and access layers that complement Volcano’s SDK surface:

  • The AI Gateway gains MCP gateway functionality, e.g., MCP servers auto-generated from Kong-managed APIs, centralized OAuth 2.1 for MCP servers, and observability over tools, workflows, and prompts in the Konnect dashboard. Together these provide unified policy and analytics for MCP traffic.
  • The Konnect Developer Portal can itself become an MCP server, so AI coding tools and agents can discover APIs, request access, and use endpoints programmatically, reducing manual credential workflows and exposing the API catalog through MCP.
  • Kong’s team also announced MCP Composer and MCP Runner for designing, building, and operating MCP servers and integrations.

Key takeaways

  • Volcano is an open-source TypeScript SDK for building multi-step AI agents with first-class MCP tool use.
  • The SDK provides production features (retries, timeouts, connection pooling, OAuth, and OpenTelemetry traces/metrics) for MCP workflows.
  • Volcano composes multiple LLMs for planning/execution and auto-discovers/invokes MCP servers and tools, minimizing custom glue code.
  • Kong pairs the SDK with platform controls: AI Gateway/Konnect add MCP server auto-generation, centralized OAuth 2.1, and observability.

Kong’s Volcano SDK is a useful addition to the MCP ecosystem: a TypeScript-first agent framework that aligns developer workflows with the enterprise controls (OAuth 2.1, OpenTelemetry) provided through AI Gateway and Konnect. This pairing closes a common gap in the agent stack (tool discovery, authentication, and observability) without inventing new interfaces beyond MCP. The design prioritizes protocol-native MCP integration over custom glue, reducing operational drift and audit gaps as in-house agents scale.


Check out the GitHub repository for code, tutorials, and technical details.


Michal Sutter is a data science professional with a master’s degree in data science from the University of Padua. With a solid foundation in statistical analysis, machine learning, and data engineering, Michal excels at transforming complex data sets into actionable insights.

