Langchain (Azure Chat OpenAI)
This guide shows how to use Langchain’s Azure ChatOpenAI client with Adastra LLMGW. Langchain provides additional abstractions and tools for building AI applications on top of the base OpenAI functionality.
Client Setup
Install the Langchain OpenAI package, which provides Langchain integrations for OpenAI models:
Set your endpoint to use the Azure OpenAI client through Langchain:
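A minimal sketch of the environment setup; the endpoint URL and key below are placeholders for the values issued by your LLMGW administrator:

```shell
# Placeholder values - substitute the endpoint and key for your LLMGW instance.
export LLMGW_API_ENDPOINT="https://llmgw.example.com"
export LLMGW_API_KEY="your-llmgw-api-key"
```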
Note: LLMGW_API_ENDPOINT is the same as it was for Azure OpenAI.
Next, we create the Langchain Azure ChatOpenAI model.
The extra_headers in model_kwargs allows you to associate metadata such as the project name and user with each request, which may be required based on your configuration. Check with your administrator for specific header requirements.
Making Requests
Now let’s make a request using Langchain’s simplified interface:
In this example:
- The `azure_deployment` parameter should match a `deployment_name` value in your LLMGW `config.yaml`. It refers to a model group configured in LLMGW.
- Langchain's `invoke()` method provides a simplified interface compared to the raw OpenAI client.
Accessing Response Metadata
For more detailed information, such as request cost and model information, you can inspect the response metadata. Since we set include_response_headers=True, LLMGW includes custom headers prefixed with x-llmgw.
The response headers include:
- `x-llmgw-cost` - The cost of the request in cents.
- `x-llmgw-request-id` - The request id used for the request.
- `x-llmgw-model-id` - The model id used for the request.
- `x-llmgw-attempts` - The number of attempts made to get the response.
Streaming Responses
Langchain also supports streaming responses for real-time output. You can use the stream() method to get response chunks as they are generated.
This provides the same streaming experience as the raw OpenAI client but through Langchain’s interface.
Advanced Langchain Features
With Langchain, you can leverage additional features like:
- Prompt templates for consistent message formatting,
- Chains for complex multi-step operations,
- Agents for autonomous task execution,
- Memory for conversation persistence.
For more information on these advanced features, see the Langchain documentation.