
Best Practices for Analyzing Logs in Data Pipelines

Analyzing logs in data pipelines is essential for maintaining system performance, troubleshooting errors, and ensuring compliance. Here's what you need to know:

Why it matters: Logs help identify bottlenecks, resolve errors, and optimize performance. They are also critical for audits and compliance.

Challenges: High log volume, varying formats, and security risks make analysis complex.

Solutions: Standardize log formats with timestamps, log levels, and metadata.
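To make the "standardize log formats" advice concrete, here is a minimal sketch of a shared log-entry shape, assuming JSON lines as the pipeline's common format (the field names are illustrative, not a standard):

```typescript
// One standardized log record: timestamp, level, message, and metadata,
// emitted as a single JSON line so every downstream tool parses it the same way.
type LogLevel = "DEBUG" | "INFO" | "WARN" | "ERROR";

interface LogEntry {
  timestamp: string;                 // ISO 8601, UTC
  level: LogLevel;
  message: string;
  metadata: Record<string, unknown>; // e.g. pipeline stage, job id
}

function makeLogEntry(
  level: LogLevel,
  message: string,
  metadata: Record<string, unknown> = {}
): LogEntry {
  return { timestamp: new Date().toISOString(), level, message, metadata };
}

const line = JSON.stringify(
  makeLogEntry("ERROR", "batch load failed", { stage: "ingest", jobId: "job-42" })
);
```

Because every record carries the same fields, log aggregators can filter by level or stage without per-source parsing rules.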

Reshaping Rental Listings - Insights from Anthemos Georgiades, Zumper | The Innovation Blueprint

In the new episode of The Innovation Blueprint Podcast, ORIL sits down with Anthemos Georgiades, CEO of Zumper, the largest privately owned rental platform in North America with over 76 million annual site visits. He explores Zumper's revolutionary approach to rental listings, the power of AI in real estate, and the critical role of backend infrastructure in driving seamless user experiences.

RBAC vs ABAC: API Security Implications

Securing APIs requires managing who can access resources and under what conditions. Two primary models stand out: Role-Based Access Control (RBAC) and Attribute-Based Access Control (ABAC). Here are the key takeaways: RBAC assigns permissions based on predefined roles, making it simple to manage in structured environments. ABAC evaluates multiple real-time attributes for dynamic, granular control, making it ideal for complex or evolving scenarios.
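The contrast can be sketched in a few lines of code. The roles, attributes, and policy below are made-up examples, not a prescribed model: RBAC is a static lookup from role to permission, while ABAC evaluates attributes of the user, resource, and context on each request.

```typescript
// RBAC: permissions are fixed per role.
type Role = "admin" | "editor" | "viewer";

const rolePermissions: Record<Role, string[]> = {
  admin: ["read", "write", "delete"],
  editor: ["read", "write"],
  viewer: ["read"],
};

function rbacAllows(role: Role, permission: string): boolean {
  return rolePermissions[role].includes(permission);
}

// ABAC: a policy over multiple attributes, evaluated at request time.
interface AbacRequest {
  user: { department: string; clearance: number };
  resource: { department: string; sensitivity: number };
  context: { businessHours: boolean };
}

function abacAllows(req: AbacRequest): boolean {
  return (
    req.user.department === req.resource.department &&
    req.user.clearance >= req.resource.sensitivity &&
    req.context.businessHours
  );
}
```

Note how adding a new condition (say, request origin) means editing the ABAC policy, whereas in RBAC it would force a proliferation of new roles.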

How to Build a Multi-LLM AI Agent with Kong AI Gateway and LangGraph

In the first two parts of this series, we discussed How to Strengthen a ReAct AI Agent with Kong AI Gateway and How to Build a Single-LLM AI Agent with Kong AI Gateway and LangGraph. In this third and final part, we're going to evolve the AI Agent with multiple LLMs and Semantic Routing policies across them. In this blog post, we'll also explore new capabilities introduced in Kong AI Gateway 3.11 that support other GenAI infrastructures.
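As a toy illustration of the semantic-routing idea (not Kong's implementation, which works with embeddings and similarity thresholds rather than keywords), the sketch below picks the LLM whose domain best matches the prompt; the model names and keyword lists are invented:

```typescript
// Route a prompt to the model whose domain keywords overlap it most;
// an empty keyword list serves as the general-purpose fallback.
interface ModelRoute {
  model: string;
  keywords: string[];
}

const routes: ModelRoute[] = [
  { model: "code-llm", keywords: ["function", "bug", "compile", "typescript"] },
  { model: "legal-llm", keywords: ["contract", "liability", "clause"] },
  { model: "general-llm", keywords: [] }, // fallback
];

function routePrompt(prompt: string): string {
  const words = prompt.toLowerCase().split(/\W+/);
  let best = routes[routes.length - 1]; // default to the fallback route
  let bestScore = 0;
  for (const route of routes) {
    const score = route.keywords.filter((k) => words.includes(k)).length;
    if (score > bestScore) {
      best = route;
      bestScore = score;
    }
  }
  return best.model;
}
```

A production gateway replaces the keyword overlap with an embedding similarity score, but the control flow (score each candidate, fall back when nothing matches) is the same.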

What is an AI Gateway?

Ever wondered what an AI Gateway is? Think of it as an airport for your AI traffic! We break down how an AI Gateway can:

Act as a central access point for different AI models.
Provide security for your LLM prompts.
Route traffic to the best model for the job.
Save on AI costs with features like response caching.

Learn the basics of this essential tool that helps manage AI and LLM costs, security, and efficiency.
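One of those capabilities, response caching, can be sketched in a few lines. This is a conceptual illustration only; the gateway and model interfaces here are invented for the example, not any product's API:

```typescript
// A gateway that answers repeated prompts from a cache, so identical
// questions don't trigger a second (paid) call to the backing model.
type ModelCall = (prompt: string) => string;

class CachingGateway {
  private cache = new Map<string, string>();
  public modelCalls = 0; // counts how often the backing model is actually hit

  constructor(private model: ModelCall) {}

  ask(prompt: string): string {
    const cached = this.cache.get(prompt);
    if (cached !== undefined) return cached; // cache hit: zero model cost
    this.modelCalls++;
    const answer = this.model(prompt);
    this.cache.set(prompt, answer);
    return answer;
  }
}

const gateway = new CachingGateway((prompt) => `answer to: ${prompt}`);
gateway.ask("What is an AI Gateway?");
gateway.ask("What is an AI Gateway?"); // served from cache, no model call
```

Real gateways extend this with semantic caching (matching similar, not just identical, prompts) and expiry policies, but the cost-saving mechanism is the same.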

Kong AI Gateway: Prompt Compression

High token consumption from long prompts can degrade model performance and lead to expensive, inefficient LLM operations. This video demonstrates how to solve that problem using Kong's AI Gateway. AI Prompt Compressor Plugin: see how this plugin intelligently compresses incoming prompts before they hit the model. It summarizes context, removes redundant information, and trims excess tokens, all while preserving the original meaning. This can lead to significant cost savings and improved performance.
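To give a feel for the "remove redundant information" step, here is a deliberately simple sketch, not the plugin's actual algorithm: it collapses runs of whitespace and drops exact duplicate lines, two cheap ways to shed tokens without changing meaning.

```typescript
// Naive prompt compression: normalize whitespace and de-duplicate lines.
// Real compressors also summarize context, which this sketch does not attempt.
function compressPrompt(prompt: string): string {
  const seen = new Set<string>();
  const lines: string[] = [];
  for (const raw of prompt.split("\n")) {
    const line = raw.trim().replace(/\s+/g, " "); // collapse whitespace runs
    if (line === "" || seen.has(line)) continue;  // skip blanks and repeats
    seen.add(line);
    lines.push(line);
  }
  return lines.join("\n");
}
```

Even these trivial passes shrink prompts that accumulate repeated boilerplate across conversation turns; the plugin's summarization goes further by rewriting content, which requires a model of its own.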

Bitrise maintains SOC 2 Type II compliance with latest successful assessment

At Bitrise, we continually invest in security best practices to ensure that our customers’ data stays safe and secure. As part of an ongoing effort, we are excited to announce that we’ve successfully completed our SOC 2 report. The examination was conducted by A-LIGN, a technology-enabled security and compliance firm trusted by more than 4,000 global organizations to help mitigate cybersecurity risks.

Unit Testing in NestJS for Node Using Suites (Formerly Automock)

For years, Automock was a popular framework for defining mocks and stubs in backend test environments. As technology has evolved, new methods and techniques for streamlining the simulation of dependencies in testing have emerged. That's why Automock has been succeeded by Suites, a more modern and robust library. In this article, we'll explore the transition from Automock to Suites, understand what Suites offers, and see it in action in NestJS through a complete example.
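To ground the idea before diving into the library, here is a hand-rolled version of the pattern Suites automates: the class under test receives a stubbed dependency, so the test exercises it in isolation. The `UserService` and `Mailer` names are hypothetical, and Suites generates test doubles like `mailerStub` for you rather than requiring them by hand.

```typescript
// The dependency to be mocked in tests.
interface Mailer {
  send(to: string, body: string): boolean;
}

// The unit under test, with its dependency injected NestJS-style.
class UserService {
  constructor(private mailer: Mailer) {}

  welcome(email: string): boolean {
    return this.mailer.send(email, "Welcome aboard!");
  }
}

// A manual stub that records calls, standing in for an auto-generated mock.
const sent: Array<{ to: string; body: string }> = [];
const mailerStub: Mailer = {
  send(to, body) {
    sent.push({ to, body });
    return true;
  },
};

const service = new UserService(mailerStub);
const ok = service.welcome("ada@example.com");
```

Writing such stubs by hand scales poorly as dependency graphs grow, which is exactly the boilerplate Automock targeted and Suites now eliminates.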