Getting started
Adastra LLMGW provides a unified API for accessing multiple large language models (LLMs), including Azure OpenAI. This guide helps you get started with sending requests through Adastra LLMGW and configuring your environment.
Prerequisites
Before using any Adastra LLMGW client, ensure you have:
System Requirements
- Python 3.12 or higher,
- Internet access via VPN to reach your LLMGW endpoint,
- Valid API credentials for Adastra LLMGW.
API Access
To use Adastra LLMGW, you need an API key. Contact your system administrator to obtain one.
Common Python Packages
Depending on which client you choose, you may need to install one or more of these packages: openai, boto3, langchain, and langchain-openai.
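All of the packages above can be installed in one step with pip (install only the ones your chosen client needs):

```shell
pip install openai boto3 langchain langchain-openai
```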
Supported Client Approaches
Adastra LLMGW supports multiple client approaches, each suited for different use cases. Choose the approach that best fits your application architecture and the specific LLM providers you need to access:
Azure OpenAI
- Best for: Azure-specific deployments with full Azure OpenAI feature support.
- Endpoint: azure-open-ai
- Package required: openai
- Key features: Native Azure OpenAI compatibility, streaming support, enterprise-grade security.
OpenAI
- Best for: Standard OpenAI API compatibility and straightforward integrations.
- Endpoint: openai
- Package required: openai
- Key features: Direct OpenAI API compatibility, simple setup, wide ecosystem support.
AWS Bedrock
- Best for: Accessing Anthropic Claude and other AWS Bedrock models.
- Endpoint: aws-bedrock
- Package required: boto3
- Key features: Access to Anthropic Claude, Amazon Titan, and other Bedrock models.
Langchain - Azure Chat OpenAI
- Best for: Building complex AI applications that benefit from Langchain’s ecosystem.
- Endpoint: various (depends on the underlying provider), for example azure-open-ai
- Packages required: langchain, langchain-openai
- Key features: Advanced AI application patterns, agent frameworks, tool integration.
Each client guide provides detailed setup instructions, code examples, and information on accessing LLMGW response headers for monitoring, debugging, and cost tracking.
Next Steps
For more detailed information about LLM Gateway, explore these sections:
- Supported Clients - Complete list of available client endpoints,
- Supported Providers - Available LLM providers and types,
- Response Headers - Detailed guide to LLMGW response headers for monitoring, debugging, and cost tracking.