# AzureChatOpenAI

> **Class** in `@langchain/openai`

📖 [View in docs](https://reference.langchain.com/javascript/langchain-openai/AzureChatOpenAI)

Azure OpenAI chat model integration.

Setup:
Install `@langchain/openai` and set the following environment variables:

```bash
npm install @langchain/openai
export AZURE_OPENAI_API_KEY="your-api-key"
export AZURE_OPENAI_API_INSTANCE_NAME="your-instance-name"
export AZURE_OPENAI_API_DEPLOYMENT_NAME="your-deployment-name"
export AZURE_OPENAI_API_VERSION="your-version"
export AZURE_OPENAI_BASE_PATH="your-base-path"
```
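Before constructing the client, it can help to fail fast if required variables are missing. A minimal sketch, assuming a Node.js environment (`requireEnv` is an illustrative helper, not part of `@langchain/openai`):

```typescript
// Sketch: fail fast when required Azure OpenAI variables are missing.
// `requireEnv` is an illustrative helper, not part of @langchain/openai.
function requireEnv(names: string[]): Record<string, string> {
  const missing = names.filter((n) => !process.env[n]);
  if (missing.length > 0) {
    throw new Error(`Missing environment variables: ${missing.join(", ")}`);
  }
  return Object.fromEntries(names.map((n) => [n, process.env[n] as string]));
}

// Example usage:
// requireEnv(["AZURE_OPENAI_API_KEY", "AZURE_OPENAI_API_DEPLOYMENT_NAME"]);
```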

## [Constructor args](https://api.js.langchain.com/classes/langchain_openai.AzureChatOpenAI.html#constructor)

## [Runtime args](https://api.js.langchain.com/interfaces/langchain_openai.ChatOpenAICallOptions.html)

Runtime args can be passed as the second argument to any of the base runnable methods (`.invoke`, `.stream`, `.batch`, etc.).
They can also be passed via `.withConfig`, or as the second argument to `.bindTools`, as shown in the examples below:

```typescript
// When calling `.withConfig`, call options should be passed via the first argument
const llmWithArgsBound = llm.withConfig({
  stop: ["\n"],
  tools: [...],
});

// When calling `.bindTools`, call options should be passed via the second argument
const llmWithTools = llm.bindTools(
  [...],
  {
    tool_choice: "auto",
  }
);
```

## Examples

<details open>
<summary><strong>Instantiate</strong></summary>

```typescript
import { AzureChatOpenAI } from '@langchain/openai';

const llm = new AzureChatOpenAI({
  azureOpenAIApiKey: process.env.AZURE_OPENAI_API_KEY, // In Node.js defaults to process.env.AZURE_OPENAI_API_KEY
  azureOpenAIApiInstanceName: process.env.AZURE_OPENAI_API_INSTANCE_NAME, // In Node.js defaults to process.env.AZURE_OPENAI_API_INSTANCE_NAME
  azureOpenAIApiDeploymentName: process.env.AZURE_OPENAI_API_DEPLOYMENT_NAME, // In Node.js defaults to process.env.AZURE_OPENAI_API_DEPLOYMENT_NAME
  azureOpenAIApiVersion: process.env.AZURE_OPENAI_API_VERSION, // In Node.js defaults to process.env.AZURE_OPENAI_API_VERSION
  temperature: 0,
  maxTokens: undefined,
  timeout: undefined,
  maxRetries: 2,
  // apiKey: "...",
  // baseUrl: "...",
  // other params...
});
```
</details>

<br />

<details>
<summary><strong>Invoking</strong></summary>

```typescript
const input = `Translate "I love programming" into French.`;

// Models also accept a list of chat messages or a formatted prompt
const result = await llm.invoke(input);
console.log(result);
```

```txt
AIMessage {
  "id": "chatcmpl-9u4Mpu44CbPjwYFkTbeoZgvzB00Tz",
  "content": "J'adore la programmation.",
  "response_metadata": {
    "tokenUsage": {
      "completionTokens": 5,
      "promptTokens": 28,
      "totalTokens": 33
    },
    "finish_reason": "stop",
    "system_fingerprint": "fp_3aa7262c27"
  },
  "usage_metadata": {
    "input_tokens": 28,
    "output_tokens": 5,
    "total_tokens": 33
  }
}
```
</details>

<br />

<details>
<summary><strong>Streaming Chunks</strong></summary>

```typescript
for await (const chunk of await llm.stream(input)) {
  console.log(chunk);
}
```

```txt
AIMessageChunk {
  "id": "chatcmpl-9u4NWB7yUeHCKdLr6jP3HpaOYHTqs",
  "content": ""
}
AIMessageChunk {
  "content": "J"
}
AIMessageChunk {
  "content": "'adore"
}
AIMessageChunk {
  "content": " la"
}
AIMessageChunk {
  "content": " programmation"
}
AIMessageChunk {
  "content": "."
}
AIMessageChunk {
  "content": "",
  "response_metadata": {
    "finish_reason": "stop",
    "system_fingerprint": "fp_c9aa9c0491"
  },
}
AIMessageChunk {
  "content": "",
  "usage_metadata": {
    "input_tokens": 28,
    "output_tokens": 5,
    "total_tokens": 33
  }
}
```
</details>

<br />

<details>
<summary><strong>Aggregate Streamed Chunks</strong></summary>

```typescript
import { AIMessageChunk } from '@langchain/core/messages';
import { concat } from '@langchain/core/utils/stream';

const stream = await llm.stream(input);
let full: AIMessageChunk | undefined;
for await (const chunk of stream) {
  full = !full ? chunk : concat(full, chunk);
}
console.log(full);
```

```txt
AIMessageChunk {
  "id": "chatcmpl-9u4PnX6Fy7OmK46DASy0bH6cxn5Xu",
  "content": "J'adore la programmation.",
  "response_metadata": {
    "prompt": 0,
    "completion": 0,
    "finish_reason": "stop"
  },
  "usage_metadata": {
    "input_tokens": 28,
    "output_tokens": 5,
    "total_tokens": 33
  }
}
```
</details>

<br />

<details>
<summary><strong>Bind tools</strong></summary>

```typescript
import { z } from 'zod';

const GetWeather = {
  name: "GetWeather",
  description: "Get the current weather in a given location",
  schema: z.object({
    location: z.string().describe("The city and state, e.g. San Francisco, CA")
  }),
}

const GetPopulation = {
  name: "GetPopulation",
  description: "Get the current population in a given location",
  schema: z.object({
    location: z.string().describe("The city and state, e.g. San Francisco, CA")
  }),
}

const llmWithTools = llm.bindTools([GetWeather, GetPopulation]);
const aiMsg = await llmWithTools.invoke(
  "Which city is hotter today and which is bigger: LA or NY?"
);
console.log(aiMsg.tool_calls);
```

```txt
[
  {
    name: 'GetWeather',
    args: { location: 'Los Angeles, CA' },
    type: 'tool_call',
    id: 'call_uPU4FiFzoKAtMxfmPnfQL6UK'
  },
  {
    name: 'GetWeather',
    args: { location: 'New York, NY' },
    type: 'tool_call',
    id: 'call_UNkEwuQsHrGYqgDQuH9nPAtX'
  },
  {
    name: 'GetPopulation',
    args: { location: 'Los Angeles, CA' },
    type: 'tool_call',
    id: 'call_kL3OXxaq9OjIKqRTpvjaCH14'
  },
  {
    name: 'GetPopulation',
    args: { location: 'New York, NY' },
    type: 'tool_call',
    id: 'call_s9KQB1UWj45LLGaEnjz0179q'
  }
]
```
</details>
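Once the model returns tool calls, each one is typically dispatched to a local handler by name. A minimal sketch based on the shape of `aiMsg.tool_calls` above (the handler implementations are illustrative stand-ins, not real APIs):

```typescript
// Sketch: dispatch tool calls to local handler functions by name.
// The handlers below are illustrative stand-ins, not real APIs.
type ToolCall = { name: string; args: { location: string }; id?: string };

const handlers: Record<string, (args: { location: string }) => string> = {
  GetWeather: ({ location }) => `weather for ${location}`,
  GetPopulation: ({ location }) => `population of ${location}`,
};

function dispatch(toolCalls: ToolCall[]): string[] {
  return toolCalls.map((tc) => {
    const handler = handlers[tc.name];
    if (!handler) throw new Error(`Unknown tool: ${tc.name}`);
    return handler(tc.args);
  });
}

// e.g. dispatch(aiMsg.tool_calls) yields one result string per tool call,
// which you would normally send back to the model as ToolMessages.
```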

<br />

<details>
<summary><strong>Structured Output</strong></summary>

```typescript
import { z } from 'zod';

const Joke = z.object({
  setup: z.string().describe("The setup of the joke"),
  punchline: z.string().describe("The punchline to the joke"),
  rating: z.number().nullable().describe("How funny the joke is, from 1 to 10")
}).describe('Joke to tell user.');

const structuredLlm = llm.withStructuredOutput(Joke, { name: "Joke" });
const jokeResult = await structuredLlm.invoke("Tell me a joke about cats");
console.log(jokeResult);
```

```txt
{
  setup: 'Why was the cat sitting on the computer?',
  punchline: 'Because it wanted to keep an eye on the mouse!',
  rating: 7
}
```
</details>

<br />

<details>
<summary><strong>JSON Object Response Format</strong></summary>

```typescript
const jsonLlm = llm.withConfig({ response_format: { type: "json_object" } });
const jsonLlmAiMsg = await jsonLlm.invoke(
  "Return a JSON object with key 'randomInts' and a value of 10 random ints in [0-99]"
);
console.log(jsonLlmAiMsg.content);
```

```txt
{
  "randomInts": [23, 87, 45, 12, 78, 34, 56, 90, 11, 67]
}
```
</details>
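Note that with `json_object` response format, `content` comes back as a JSON *string*, so it still needs to be parsed before use. A small sketch (the literal string stands in for `jsonLlmAiMsg.content`):

```typescript
// Sketch: the model returns a JSON string in `content`; parse it before use.
// `raw` is a stand-in for jsonLlmAiMsg.content.
const raw = '{"randomInts": [23, 87, 45]}';
const parsed: { randomInts: number[] } = JSON.parse(raw);
// parsed.randomInts is now a regular number array.
```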

<br />

<details>
<summary><strong>Multimodal</strong></summary>

```typescript
import { HumanMessage } from '@langchain/core/messages';

const imageUrl = "https://example.com/image.jpg";
const imageData = await fetch(imageUrl).then(res => res.arrayBuffer());
const base64Image = Buffer.from(imageData).toString('base64');

const message = new HumanMessage({
  content: [
    { type: "text", text: "describe the weather in this image" },
    {
      type: "image_url",
      image_url: { url: `data:image/jpeg;base64,${base64Image}` },
    },
  ]
});

const imageDescriptionAiMsg = await llm.invoke([message]);
console.log(imageDescriptionAiMsg.content);
```

```txt
The weather in the image appears to be clear and sunny. The sky is mostly blue with a few scattered white clouds, indicating fair weather. The bright sunlight is casting shadows on the green, grassy hill, suggesting it is a pleasant day with good visibility. There are no signs of rain or stormy conditions.
```
</details>

<br />

<details>
<summary><strong>Usage Metadata</strong></summary>

```typescript
const aiMsgForMetadata = await llm.invoke(input);
console.log(aiMsgForMetadata.usage_metadata);
```

```txt
{ input_tokens: 28, output_tokens: 5, total_tokens: 33 }
```
</details>
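One common use of `usage_metadata` is cost tracking. A minimal sketch, where the per-token prices are illustrative placeholders and not real Azure OpenAI rates:

```typescript
// Sketch: estimate request cost from usage_metadata.
// Prices are illustrative placeholders, not real Azure OpenAI rates.
const PRICE_PER_1K_INPUT = 0.005;
const PRICE_PER_1K_OUTPUT = 0.015;

function estimateCost(usage: { input_tokens: number; output_tokens: number }): number {
  return (
    (usage.input_tokens / 1000) * PRICE_PER_1K_INPUT +
    (usage.output_tokens / 1000) * PRICE_PER_1K_OUTPUT
  );
}

// e.g. estimateCost(aiMsgForMetadata.usage_metadata)
```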

<br />

<details>
<summary><strong>Logprobs</strong></summary>

```typescript
const logprobsLlm = new AzureChatOpenAI({ logprobs: true });
const aiMsgForLogprobs = await logprobsLlm.invoke(input);
console.log(aiMsgForLogprobs.response_metadata.logprobs);
```

```txt
{
  content: [
    {
      token: 'J',
      logprob: -0.000050616763,
      bytes: [Array],
      top_logprobs: []
    },
    {
      token: "'",
      logprob: -0.01868736,
      bytes: [Array],
      top_logprobs: []
    },
    {
      token: 'ad',
      logprob: -0.0000030545007,
      bytes: [Array],
      top_logprobs: []
    },
    { token: 'ore', logprob: 0, bytes: [Array], top_logprobs: [] },
    {
      token: ' la',
      logprob: -0.515404,
      bytes: [Array],
      top_logprobs: []
    },
    {
      token: ' programm',
      logprob: -0.0000118755715,
      bytes: [Array],
      top_logprobs: []
    },
    { token: 'ation', logprob: 0, bytes: [Array], top_logprobs: [] },
    {
      token: '.',
      logprob: -0.0000037697225,
      bytes: [Array],
      top_logprobs: []
    }
  ],
  refusal: null
}
```
</details>

<br />

<details>
<summary><strong>Response Metadata</strong></summary>

```typescript
const aiMsgForResponseMetadata = await llm.invoke(input);
console.log(aiMsgForResponseMetadata.response_metadata);
```

```txt
{
  tokenUsage: { completionTokens: 5, promptTokens: 28, totalTokens: 33 },
  finish_reason: 'stop',
  system_fingerprint: 'fp_3aa7262c27'
}
```
</details>

## Signature

```typescript
class AzureChatOpenAI
```

## Extends

- `ChatOpenAI<CallOptions>`

## Implements

- `Partial<AzureOpenAIChatInput>`

## Constructors

- [`constructor()`](https://reference.langchain.com/javascript/langchain-openai/AzureChatOpenAI/constructor)

## Properties

- `__includeRawResponse`
- `apiKey`
- `audio`
- `azureADTokenProvider`
- `azureOpenAIApiDeploymentName`
- `azureOpenAIApiInstanceName`
- `azureOpenAIApiKey`
- `azureOpenAIApiVersion`
- `azureOpenAIBasePath`
- `azureOpenAIEndpoint`
- `cache`
- `callbacks`
- `caller`
- `completions`
- `defaultOptions`
- `disableStreaming`
- `fields`
- `frequencyPenalty`
- `lc_kwargs`
- `lc_namespace`
- `lc_runnable`
- `lc_serializable`
- `logitBias`
- `logprobs`
- `maxTokens`
- `metadata`
- `modalities`
- `model`
- `modelKwargs`
- `n`
- `name`
- `organization`
- `outputVersion`
- `ParsedCallOptions`
- `presencePenalty`
- `promptCacheKey`
- `promptCacheRetention`
- `reasoning`
- `responses`
- `service_tier`
- `stop`
- `stopSequences`
- `streaming`
- `streamUsage`
- `supportsStrictToolCalling`
- `tags`
- `temperature`
- `timeout`
- `topLogprobs`
- `topP`
- `user`
- `useResponsesApi`
- `verbose`
- `verbosity`
- `zdrEnabled`
- `callKeys`
- `lc_aliases`
- `lc_attributes`
- `lc_id`
- `lc_secrets`
- `lc_serializable_keys`
- `profile`

## Methods

- [`_addVersion()`](https://reference.langchain.com/javascript/langchain-openai/AzureChatOpenAI/_addVersion)
- [`_batchWithConfig()`](https://reference.langchain.com/javascript/langchain-openai/AzureChatOpenAI/_batchWithConfig)
- [`_callWithConfig()`](https://reference.langchain.com/javascript/langchain-openai/AzureChatOpenAI/_callWithConfig)
- [`_combineCallOptions()`](https://reference.langchain.com/javascript/langchain-openai/AzureChatOpenAI/_combineCallOptions)
- [`_convertChatOpenAIToolToCompletionsTool()`](https://reference.langchain.com/javascript/langchain-openai/AzureChatOpenAI/_convertChatOpenAIToolToCompletionsTool)
- [`_filterInvocationParamsForTracing()`](https://reference.langchain.com/javascript/langchain-openai/AzureChatOpenAI/_filterInvocationParamsForTracing)
- [`_generateCached()`](https://reference.langchain.com/javascript/langchain-openai/AzureChatOpenAI/_generateCached)
- [`_getOptionsList()`](https://reference.langchain.com/javascript/langchain-openai/AzureChatOpenAI/_getOptionsList)
- [`_getSerializedCacheKeyParametersForCall()`](https://reference.langchain.com/javascript/langchain-openai/AzureChatOpenAI/_getSerializedCacheKeyParametersForCall)
- [`_llmType()`](https://reference.langchain.com/javascript/langchain-openai/AzureChatOpenAI/_llmType)
- [`_modelType()`](https://reference.langchain.com/javascript/langchain-openai/AzureChatOpenAI/_modelType)
- [`_separateRunnableConfigFromCallOptions()`](https://reference.langchain.com/javascript/langchain-openai/AzureChatOpenAI/_separateRunnableConfigFromCallOptions)
- [`_separateRunnableConfigFromCallOptionsCompat()`](https://reference.langchain.com/javascript/langchain-openai/AzureChatOpenAI/_separateRunnableConfigFromCallOptionsCompat)
- [`_streamIterator()`](https://reference.langchain.com/javascript/langchain-openai/AzureChatOpenAI/_streamIterator)
- [`_streamLog()`](https://reference.langchain.com/javascript/langchain-openai/AzureChatOpenAI/_streamLog)
- [`_streamResponseChunks()`](https://reference.langchain.com/javascript/langchain-openai/AzureChatOpenAI/_streamResponseChunks)
- [`_transformStreamWithConfig()`](https://reference.langchain.com/javascript/langchain-openai/AzureChatOpenAI/_transformStreamWithConfig)
- [`_useResponsesApi()`](https://reference.langchain.com/javascript/langchain-openai/AzureChatOpenAI/_useResponsesApi)
- [`assign()`](https://reference.langchain.com/javascript/langchain-openai/AzureChatOpenAI/assign)
- [`asTool()`](https://reference.langchain.com/javascript/langchain-openai/AzureChatOpenAI/asTool)
- [`batch()`](https://reference.langchain.com/javascript/langchain-openai/AzureChatOpenAI/batch)
- [`bindTools()`](https://reference.langchain.com/javascript/langchain-openai/AzureChatOpenAI/bindTools)
- [`generate()`](https://reference.langchain.com/javascript/langchain-openai/AzureChatOpenAI/generate)
- [`generatePrompt()`](https://reference.langchain.com/javascript/langchain-openai/AzureChatOpenAI/generatePrompt)
- [`getGraph()`](https://reference.langchain.com/javascript/langchain-openai/AzureChatOpenAI/getGraph)
- [`getLsParams()`](https://reference.langchain.com/javascript/langchain-openai/AzureChatOpenAI/getLsParams)
- [`getLsParamsWithDefaults()`](https://reference.langchain.com/javascript/langchain-openai/AzureChatOpenAI/getLsParamsWithDefaults)
- [`getName()`](https://reference.langchain.com/javascript/langchain-openai/AzureChatOpenAI/getName)
- [`getNumTokens()`](https://reference.langchain.com/javascript/langchain-openai/AzureChatOpenAI/getNumTokens)
- [`getNumTokensFromMessages()`](https://reference.langchain.com/javascript/langchain-openai/AzureChatOpenAI/getNumTokensFromMessages)
- [`identifyingParams()`](https://reference.langchain.com/javascript/langchain-openai/AzureChatOpenAI/identifyingParams)
- [`invocationParams()`](https://reference.langchain.com/javascript/langchain-openai/AzureChatOpenAI/invocationParams)
- [`invoke()`](https://reference.langchain.com/javascript/langchain-openai/AzureChatOpenAI/invoke)
- [`moderateContent()`](https://reference.langchain.com/javascript/langchain-openai/AzureChatOpenAI/moderateContent)
- [`pick()`](https://reference.langchain.com/javascript/langchain-openai/AzureChatOpenAI/pick)
- [`pipe()`](https://reference.langchain.com/javascript/langchain-openai/AzureChatOpenAI/pipe)
- [`serialize()`](https://reference.langchain.com/javascript/langchain-openai/AzureChatOpenAI/serialize)
- [`stream()`](https://reference.langchain.com/javascript/langchain-openai/AzureChatOpenAI/stream)
- [`streamEvents()`](https://reference.langchain.com/javascript/langchain-openai/AzureChatOpenAI/streamEvents)
- [`streamLog()`](https://reference.langchain.com/javascript/langchain-openai/AzureChatOpenAI/streamLog)
- [`toJSON()`](https://reference.langchain.com/javascript/langchain-openai/AzureChatOpenAI/toJSON)
- [`toJSONNotImplemented()`](https://reference.langchain.com/javascript/langchain-openai/AzureChatOpenAI/toJSONNotImplemented)
- [`transform()`](https://reference.langchain.com/javascript/langchain-openai/AzureChatOpenAI/transform)
- [`withConfig()`](https://reference.langchain.com/javascript/langchain-openai/AzureChatOpenAI/withConfig)
- [`withFallbacks()`](https://reference.langchain.com/javascript/langchain-openai/AzureChatOpenAI/withFallbacks)
- [`withListeners()`](https://reference.langchain.com/javascript/langchain-openai/AzureChatOpenAI/withListeners)
- [`withRetry()`](https://reference.langchain.com/javascript/langchain-openai/AzureChatOpenAI/withRetry)
- [`withStructuredOutput()`](https://reference.langchain.com/javascript/langchain-openai/AzureChatOpenAI/withStructuredOutput)
- [`_convertInputToPromptValue()`](https://reference.langchain.com/javascript/langchain-openai/AzureChatOpenAI/_convertInputToPromptValue)
- [`deserialize()`](https://reference.langchain.com/javascript/langchain-openai/AzureChatOpenAI/deserialize)
- [`isRunnable()`](https://reference.langchain.com/javascript/langchain-openai/AzureChatOpenAI/isRunnable)
- [`lc_name()`](https://reference.langchain.com/javascript/langchain-openai/AzureChatOpenAI/lc_name)

---

[View source on GitHub](https://github.com/langchain-ai/langchainjs/blob/ad153c185b6cf813d4b7695740d9a4453d2cb63f/libs/providers/langchain-openai/src/azure/chat_models/index.ts#L430)