Debugging and tracing agentic applications remains challenging, especially as agent systems grow more complex. Developers and security teams alike struggle to monitor and audit agent interactions, which can lead to unnoticed security vulnerabilities and degraded agent performance.
To address this, we are announcing Invariant Gateway, a lightweight, zero-configuration service that acts as an intermediary between AI agents and LLM providers such as OpenAI and Anthropic. By automatically tracing agent interactions and storing them in the Invariant Explorer, Gateway provides valuable insights into agent behavior, without requiring any code changes in your agentic applications. This enables developers and researchers to observe, debug, and optimize their AI workflows using Invariant's agent toolchain, including security scanning, debugging and visualization.
A Non-Invasive, Zero-Configuration Approach
Gateway was designed as a transparent pass-through service for your LLM requests and requires no code changes in your agentic applications beyond updating the base URL of your LLM provider. Route requests through the Invariant Gateway, and all agentic interactions are automatically captured and stored in the Invariant Explorer.
Today, Gateway includes support for the following:
- Simple Setup: Update the base URL of your LLM provider to route requests through the Invariant Gateway.
- Comprehensive Monitoring: Intercepts LLM-level interactions for enhanced debugging and analysis.
- Support for Tool Use and Computer Use: Captures all agentic interactions, including function calling and computer-use actions such as mouse clicks, screenshots, and navigation (see the sketch after this list).
- Seamless Request Forwarding and Streaming: Ensures uninterrupted communication with OpenAI, Anthropic, and other LLM providers, with little to no added latency.
- Persistent and Organized Traces: Automatically stores runtime traces in Invariant Explorer, allowing for structured review and analysis.
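To make the tool-use and streaming points concrete, here is a minimal sketch of a function-calling request and a streaming request routed through the Gateway using the standard OpenAI Python SDK. The dataset name (weather-agent), the Invariant API key, the model, and the get_weather tool are illustrative placeholders, and the usual OPENAI_API_KEY environment variable is assumed to be set; the endpoint format matches the quickstart shown later in this post.

from httpx import Client
from openai import OpenAI

# Only the base URL and the extra header differ from a normal OpenAI client setup.
client = OpenAI(
    http_client=Client(
        headers={"Invariant-Authorization": "Bearer your-invariant-api-key"},
    ),
    base_url="https://explorer.invariantlabs.ai/api/v1/gateway/weather-agent/openai",
)

# An illustrative tool definition; the Gateway sees the full request,
# including the tool schema and any tool calls the model returns.
tools = [
    {
        "type": "function",
        "function": {
            "name": "get_weather",
            "description": "Look up the current weather for a city.",
            "parameters": {
                "type": "object",
                "properties": {"city": {"type": "string"}},
                "required": ["city"],
            },
        },
    }
]

response = client.chat.completions.create(
    model="gpt-4o",
    messages=[{"role": "user", "content": "What is the weather in Paris?"}],
    tools=tools,
)

# Any tool call (name and arguments) becomes part of the trace stored in Explorer.
print(response.choices[0].message.tool_calls)

# Streaming works through the same client; chunks are forwarded as they arrive.
stream = client.chat.completions.create(
    model="gpt-4o",
    messages=[{"role": "user", "content": "Summarize the forecast in one sentence."}],
    stream=True,
)
for chunk in stream:
    print(chunk.choices[0].delta.content or "", end="")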
Adopting Gateway in Your Organization
Gateway can be deployed at the organization level, allowing security teams to monitor and audit all agentic interactions across the organization. This is especially useful for organizations with multiple teams and projects using different agentic applications that need to stay on top of security and compliance.
For more information about deploying Gateway on an organization level, please refer to the documentation for the Organization Gateway and reach out to us at [email protected].
Adopting Gateway as a Developer
Gateway also greatly benefits agent developers who are building and debugging their agents, as it enables seamless integration with Explorer and the rest of the Invariant stack.
To integrate Gateway with, for example, an OpenAI-powered agent, update the base URL of your OpenAI client to point to the Invariant Gateway endpoint:
from httpx import Client
from openai import OpenAI

openai_client = OpenAI(
    http_client=Client(
        headers={
            # Authenticates the request against Invariant Explorer.
            "Invariant-Authorization": "Bearer your-invariant-api-key"
        },
    ),
    # Route all requests through the Gateway; traces are stored in the named dataset.
    base_url="https://explorer.invariantlabs.ai/api/v1/gateway/{add-your-dataset-name-here}/openai",
)

result = openai_client.chat.completions.create(
    model="gpt-4",
    messages=[
        {"role": "user", "content": "What is the capital of France?"},
    ],
)
print("result:", result)
Note: Remove the curly braces {} from the dataset name. If the specified dataset does not already exist in the Invariant Explorer, it will be automatically created upon first use.
With this simple code change, all agentic interactions will be automatically captured and made available to the Invariant stack for analysis and debugging.
Integrations and Deployment
Invariant Gateway seamlessly integrates with leading LLM providers, including OpenAI, Anthropic, and Gemini. Additionally, it supports low- and no-code integrations with the most popular agent frameworks and agent systems, such as Microsoft AutoGen, OpenAI Swarm, OpenHands, SWE Agent, and Browser Use. More integrations are continuously being added to expand the capabilities of the Invariant ecosystem.
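For Anthropic-based agents, the setup is analogous to the OpenAI example above. The sketch below assumes the Gateway exposes an anthropic endpoint that mirrors the openai one (see the documentation for the exact URL); the dataset name, Invariant API key, and model are placeholders, and the usual ANTHROPIC_API_KEY environment variable is assumed to be set.

from anthropic import Anthropic

anthropic_client = Anthropic(
    # Assumed endpoint, mirroring the OpenAI gateway URL shown above.
    base_url="https://explorer.invariantlabs.ai/api/v1/gateway/{add-your-dataset-name-here}/anthropic",
    default_headers={
        "Invariant-Authorization": "Bearer your-invariant-api-key"
    },
)

message = anthropic_client.messages.create(
    model="claude-3-5-sonnet-latest",
    max_tokens=256,
    messages=[
        {"role": "user", "content": "What is the capital of France?"},
    ],
)
print(message.content)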
Open Source: Gateway is open source and available on GitHub. This means Gateway can be publicly audited and improved by the community, ensuring transparency and security.
Local Deployment: Like Explorer, Gateway can also be deployed locally or on-premise, which ensures confidentiality and security for sensitive data. For more information on local deployment, please refer to the documentation for the Self-Hosted Gateway.
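Once a self-hosted Gateway instance is running, clients are pointed at it in exactly the same way. The host below is a placeholder for your own deployment, and the path structure is assumed to mirror the hosted endpoint:

from httpx import Client
from openai import OpenAI

openai_client = OpenAI(
    http_client=Client(
        headers={"Invariant-Authorization": "Bearer your-invariant-api-key"},
    ),
    # Placeholder host for a self-hosted Gateway deployment; the path is assumed
    # to mirror the hosted endpoint.
    base_url="https://gateway.internal.example.com/api/v1/gateway/{add-your-dataset-name-here}/openai",
)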
Conclusion
Gateway is the next step in our mission to provide a comprehensive platform for secure and reliable agentic applications. By enabling seamless integration with leading LLM providers and agent frameworks, Gateway empowers organizations and developers to build, debug, and optimize their AI workflows with ease.
We invite developers, researchers, and AI enthusiasts to explore the capabilities of the Invariant Gateway combined with the Invariant Explorer. If you have any feedback or questions, reach out to us at [email protected].
To learn more about Gateway, see the official documentation and the GitHub repository.