
API Summit 2025 Recap: AI Connectivity and the Agentic Era

That’s a wrap on API Summit 2025! At our eighth annual event, the brightest minds in the worlds of APIs and AI gathered in New York City to define the next chapter of digital innovation. We're entering an era where APIs are not just connecting services but connecting intelligence. APIs are the neural pathways of this new AI world, where agents will reason, act, and collaborate through endpoints. At this year's API Summit, we saw how quickly this vision is becoming reality.

StudioAssist + MCP: 6 Hands-On Use Cases Every QA Engineer Should Know

The new StudioAssist Agent Mode turns your AI assistant in Katalon Studio into a connected, context-aware testing partner. It now supports MCP Servers, HTTP-based services that let the agent fetch real-time information and perform actions directly inside your project. Katalon ships with two built-in MCP Servers, and you can also add your own HTTP-based MCP Servers to extend StudioAssist’s reach. (Note: authentication support is coming soon.)
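MCP servers speak JSON-RPC 2.0, exposing tools an agent can list and call. Below is a minimal sketch of that request/response shape; the tool name and handler are hypothetical illustrations, not Katalon's built-in servers.

```python
import json

# Hypothetical tool registry -- illustrative only, not Katalon's built-ins.
TOOLS = {
    "get_test_status": lambda args: {"suite": args["suite"], "status": "passing"},
}

def handle_mcp_request(raw: str) -> str:
    """Dispatch a JSON-RPC 2.0 message of the kind MCP servers exchange."""
    req = json.loads(raw)
    if req["method"] == "tools/list":
        # Advertise available tools so the agent can discover them.
        result = {"tools": [{"name": name} for name in TOOLS]}
    elif req["method"] == "tools/call":
        # Invoke the named tool with the supplied arguments.
        params = req["params"]
        result = {"content": TOOLS[params["name"]](params.get("arguments", {}))}
    else:
        return json.dumps({"jsonrpc": "2.0", "id": req.get("id"),
                           "error": {"code": -32601, "message": "method not found"}})
    return json.dumps({"jsonrpc": "2.0", "id": req.get("id"), "result": result})
```

A real server would sit behind an HTTP endpoint and follow the full MCP handshake; this sketch only shows the tool-discovery and tool-call core.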

QualityKiosk and Katalon Launch Co-Lab: A Joint Innovation Lab Driving the Future of AI-Native Test Automation

We are pleased to announce the launch of the QualityKiosk–Katalon Joint Innovation Lab on October 15th, marking a significant milestone in our shared mission to advance the future of test automation and quality engineering. This strategic collaboration between QualityKiosk Technologies and Katalon reflects our commitment to empowering enterprises with prebuilt, next-generation testing and automation solutions designed to enhance agility, efficiency, and innovation.

How to Test Your AI Apps and Features: A Comprehensive Guide for QA Leaders

Your CEO just announced the company’s AI-first strategy and the product team is shipping AI features faster than ever. Marketing is promising intelligent automation to customers, while the QA team is left wondering how to actually test this stuff. Every QA team is grappling with the same challenge as AI becomes the default solution for everything from customer service to content generation.

AI-Powered Data Modeling: From Concept to Production Warehouse in Days

Enterprise data teams spend millions on warehouse infrastructure while still designing schemas the way they did in 1995: one entity at a time, one relationship at a time, hoping the model survives its first encounter with production data. The irony runs deep: organizations racing to deploy real-time analytics are bottlenecked by modeling processes that take six to eight weeks before a single pipeline runs. Data warehouses succeed or fail on design.

Metrics That Matter for Agentic Testing

Traditional test metrics like automation percentage, pass/fail rates, and defect counts don’t reflect the impact of introducing agents into the QA process. This blog explores a new class of KPIs designed to measure how well your virtual test team is performing, including Agent Assist Rate, Human Override Rate, Scenario Coverage Delta, and Review Time Saved.
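The post names these KPIs without prescribing formulas. As one hedged interpretation, the first two could be computed from a log of review events like this (the event schema and formulas are assumptions for illustration):

```python
def agent_metrics(events):
    """Compute illustrative agentic-testing KPIs from a list of review events.

    Each event is a dict with booleans "agent_assisted" and "human_override".
    These formulas are assumptions, not the blog's definitions.
    """
    total = len(events)
    assisted = sum(1 for e in events if e["agent_assisted"])
    overridden = sum(1 for e in events if e["agent_assisted"] and e["human_override"])
    return {
        # Share of test activities where an agent contributed
        "agent_assist_rate": assisted / total if total else 0.0,
        # Share of agent contributions a human had to correct
        "human_override_rate": overridden / assisted if assisted else 0.0,
    }
```

Scenario Coverage Delta and Review Time Saved would similarly compare coverage and reviewer-minutes before and after agents join the loop.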

IAM for Agentic AI: Episode 03 - Deep Dive into #Asgardeo's Agent Identity Capabilities

In Episode 3 of our *IAM for Agentic AI* series, we take a closer look at practical solutions for securing your AI agents with Asgardeo. As promised, Geethika Cooray and Ayesha Dissanayaka return to provide a deep dive into Asgardeo's IAM capabilities specifically designed for AI agents. Ayesha walks through a live demo, showcasing how Asgardeo can securely enable AI capabilities within your existing business systems.

Introducing the Volcano SDK to Build AI Agents in a Few Lines of Code

Today, we're open-sourcing Volcano SDK, a TypeScript SDK for building AI agents that combines LLM reasoning with real-world actions through MCP tools. Why Volcano SDK? One reason: because 9 lines of code are faster to write and easier to manage than 100+. Without Volcano SDK, you'd need 100+ lines handling tool schemas, context management, provider switching, error handling, and HTTP clients. With Volcano SDK: 9 lines.

Introducing New MCP Support Across the Entire Konnect Platform

If you’ve been following Kong, you know that Kong was the first in the API platform space to introduce an enterprise-grade AI Gateway for LLM workloads. Today, we’ve also introduced a new enterprise-grade MCP Gateway to ensure that you can roll out production-ready MCP deployments. But we are focused on more than just the Gateway: we’re also excited to announce additional MCP workflow support in the Konnect Developer Portal and a brand-new MCP integration solution, the MCP Composer.