Announcing Kong’s New Open Source AI Gateway


Product Releases

February 15, 2024

6 min read

Today I’m excited to announce that Kong has released six new open source AI plugins in Kong Gateway 3.6 that turn every Kong Gateway deployment into an AI Gateway. These new plugins are available today and are entirely free and open source for everyone.

The six new plugins are AI Proxy, AI Request Transformer, AI Response Transformer, AI Prompt Guard, AI Prompt Template, and AI Prompt Decorator.


By upgrading your current Kong Gateway to version 3.6 (also announced today), you'll be able to use these new plugins, which are entirely focused on AI and LLM usage. Developers who want to integrate one or more LLMs into their products can be more productive and ship AI capabilities faster, while architects and platform teams get visibility, control, and compliance on every AI request their teams send. And because it's built on top of Kong Gateway, you can orchestrate AI flows in the cloud or on self-hosted LLMs with high performance and low latency, both of which are critical in AI-based applications.

All existing 1,000+ Kong Gateway plugins (official and community) are available out of the box on top of your AI traffic — like AuthN/Z, traffic control, rate-limiting, transformations, and more — making Kong's AI Gateway the most capable one in the entire ecosystem. It’s also natively supported by Kong’s cloud platform, Kong Konnect, as well as Kong Gateway Enterprise and Kong Gateway OSS. 

I’ve recorded six short demonstration videos, one for each of the new AI plugins, and you can get started for free today.

What can you do with AI Gateway?

With these new AI plugins, you can:

  • Build multi-LLM integrations — The “ai-proxy” plugin allows you to consume multiple LLM implementations — in the cloud or self-hosted — with the same API interface. It ships with native support for OpenAI, Azure AI, Cohere, Anthropic, Mistral, and LLaMA, and because we standardize how they can be used, you can also easily switch between LLMs at the “flip of a switch” without having to change your application code. This is great for using multiple specialized models in your applications and for prototyping.
  • Manage AI credentials centrally — With “ai-proxy” you can also store all AI credentials, including tokens and API keys, in Kong Gateway without having to store them in your applications, so you can easily update and rotate them on the fly and in one centralized place without having to update your code.
  • Collect L7 AI metrics — Using the new “ai-proxy” plugin you can now collect L7 analytics — like the number of request and response tokens, or the LLM providers and models used — and send them to any third-party platform like Datadog or New Relic, or to any logging plugin that Kong Gateway already supports (such as TCP, Syslog, and Prometheus). This not only simplifies monitoring of AI traffic across your applications, but also gives you insight into which LLM technologies developers are using most in the organization. These L7 AI observability metrics are in addition to all the other request and response metrics Kong Gateway already collects.
  • No-code AI integrations — You can leverage the benefits of AI without writing a single line of application code by using the new “ai-request-transformer” and “ai-response-transformer” plugins, which intercept every API request and response and augment it with any AI prompt you have configured. For example, you can translate an existing API response on the fly for internationalization without changing the API or your client applications. You can enrich, transform, and convert all existing API traffic without lifting a finger. You can even enrich an AI request directed to a specific LLM provider with another LLM provider that instantly updates the request before it’s sent to the final destination.
  • Advanced AI prompt engineering — We’re shipping three new plugins fully focused on advanced prompt engineering, to fundamentally simplify and improve how you use AI in your applications. With “ai-prompt-template” we’re introducing an easy way to create prompt templates that are managed centrally in Kong Gateway and used on the fly by applications, which send only the named values to fill into the templates. This way you can update your prompts later without updating your applications, or even enforce a compliance process for adding a new approved template to the system.
  • Decorate your AI prompts — Most AI prompts also set a context for what the AI should or should not do, and specify rules for interpreting requests and responses. Instead of setting up that context every time, you can configure it centrally with the “ai-prompt-decorator” plugin, which prepends or appends your context on the fly to every AI request. This is also useful for enforcing compliance in the organization — for example, by instructing the AI to never discuss restricted topics.
  • AI prompt firewall — This capability is oriented toward teams and organizations that want to ensure prompts are approved, and that nobody mistakenly uses the wrong prompts in their applications. With the “ai-prompt-guard” plugin you can set a list of rules to allow or deny the free-form prompts that applications generate and that Kong Gateway receives, before they’re sent to the LLM providers.
  • Create an AI egress with 1,000+ features — By leveraging these new AI capabilities in Kong Gateway you can centralize how you manage, secure, and observe all your AI traffic from one place. You can also use all the existing 1,000+ official and community plugins of Kong Gateway to further secure how your AI egress should be accessed by other developers (AuthN/Z, mTLS, etc.), add rate-limiting, introduce consumption tiers, or develop sophisticated traffic control rules between one AI provider to another. 
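To make the multi-LLM idea concrete, here is a minimal declarative configuration sketch for the “ai-proxy” plugin, assuming an OpenAI chat model. The service, route, and placeholder API key are illustrative; consult the plugin reference for the full schema.

```yaml
# kong.yml (decK declarative format); service and route names are hypothetical
_format_version: "3.0"
services:
  - name: ai-service
    url: http://localhost:32000   # placeholder upstream; ai-proxy routes to the LLM provider
    routes:
      - name: chat
        paths:
          - /chat
plugins:
  - name: ai-proxy
    route: chat
    config:
      route_type: llm/v1/chat
      auth:
        header_name: Authorization
        header_value: Bearer <OPENAI_API_KEY>   # stored in Kong, not in your apps
      model:
        provider: openai
        name: gpt-4
        options:
          max_tokens: 512
          temperature: 1.0
```

Switching LLMs then becomes a configuration change: point `config.model.provider` at another supported provider (for example, cohere or anthropic) while applications keep calling the same /chat route.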
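The no-code transformer plugins follow a similar shape. As a sketch, the “ai-response-transformer” plugin takes a natural-language prompt plus an LLM block mirroring ai-proxy's; the prompt text, model, and placeholder key below are illustrative, not a verbatim production setup.

```yaml
# Hypothetical example: translate API responses on the fly for internationalization
plugins:
  - name: ai-response-transformer
    config:
      prompt: >
        Translate every human-readable string value in this JSON body
        into French. Keep the keys and structure unchanged.
      llm:
        route_type: llm/v1/chat
        auth:
          header_name: Authorization
          header_value: Bearer <OPENAI_API_KEY>
        model:
          provider: openai
          name: gpt-4
```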
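A centrally managed prompt template might look like the following sketch, where the template name and its `{{text}}` parameter are illustrative:

```yaml
# Hypothetical "summarizer" template managed centrally in Kong Gateway
plugins:
  - name: ai-prompt-template
    config:
      templates:
        - name: summarizer
          template: |
            {
              "messages": [
                {
                  "role": "user",
                  "content": "Summarize the following text in one paragraph: {{text}}"
                }
              ]
            }
```

Applications then reference the template by name and supply only the named values, so the prompt itself can evolve in Kong without any application change.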
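Decorating prompts is also a small amount of configuration. A sketch, with an illustrative compliance rule as the prepended system context:

```yaml
# Hypothetical compliance context prepended to every AI request on this gateway
plugins:
  - name: ai-prompt-decorator
    config:
      prompts:
        prepend:
          - role: system
            content: "You are a polite assistant. Never discuss pricing or legal topics."
```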
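And the prompt firewall is expressed as allow and deny rules evaluated against incoming prompts. The patterns below are illustrative regular expressions, not recommended rules:

```yaml
# Hypothetical allow/deny rules for free-form prompts
plugins:
  - name: ai-prompt-guard
    config:
      allow_patterns:
        - ".*(customer support|order status).*"
      deny_patterns:
        - ".*internal (credentials|secrets).*"
```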

Every AI egress in Kong Gateway is simply a service like any other, so all Kong Gateway features and plugins are available on day one. This makes Kong’s AI Gateway the most capable one in the entire AI ecosystem. Kong Konnect’s developer portal and service catalog are also supported.

With Kong AI Gateway, we can address cross-cutting capabilities that every team would otherwise need to build themselves.

What’s next

At Kong, we specialize in providing modern infrastructure for all API use cases, and the most recent driver to API usage in the world has been AI. Over the past few months, we’ve worked with select Kong Gateway customers and users to cover their most common AI use cases as we’ve prepared to release the plugins we’re announcing today. 

More AI capabilities will be shipped in the future, and I’m looking forward to hearing your feedback.

Get started with AI Gateway

You can get started with the new AI Gateway today by reading the getting started guide. You can run the new plugins on a standalone Kong Gateway installation or on Kong Konnect, our unified API management platform.

Watch: Adopt AI and Multi-LLM Strategies in a Secure and Governable Way with Kong

Want to learn more about Kong AI Gateway? Join us to discuss the intersection of API management and artificial intelligence (AI), and see how Kong addresses the challenges organizations face in adopting AI.


Marco Palladino

CTO and Co-Founder of Kong
