
    Module @langchain/openai - v1.0.0-alpha.1

    @langchain/openai

    This package contains the LangChain.js integrations for OpenAI through their SDK.

    npm install @langchain/openai @langchain/core
    

    This package, along with the main LangChain package, depends on @langchain/core. If you are using this package with other LangChain packages, you should make sure that all of the packages depend on the same instance of @langchain/core. You can do so by adding appropriate fields to your project's package.json like this:

    {
      "name": "your-project",
      "version": "0.0.0",
      "dependencies": {
        "@langchain/core": "^0.3.0",
        "@langchain/openai": "^0.0.0"
      },
      "resolutions": {
        "@langchain/core": "^0.3.0"
      },
      "overrides": {
        "@langchain/core": "^0.3.0"
      },
      "pnpm": {
        "overrides": {
          "@langchain/core": "^0.3.0"
        }
      }
    }

    Which field you need depends on the package manager you're using, but we recommend adding entries for each of the common package managers (pnpm, npm, and yarn) to maximize compatibility.

    This package contains the ChatOpenAI class, which is the recommended way to interface with the OpenAI series of models.

    To use, install the requirements and configure your environment:

    export OPENAI_API_KEY=your-api-key
    

    Then initialize the model and invoke it:

    import { ChatOpenAI } from "@langchain/openai";
    import { HumanMessage } from "@langchain/core/messages";

    const model = new ChatOpenAI({
      apiKey: process.env.OPENAI_API_KEY,
      model: "gpt-4-1106-preview",
    });
    const response = await model.invoke(new HumanMessage("Hello world!"));

    You can also stream responses:

    import { ChatOpenAI } from "@langchain/openai";
    import { HumanMessage } from "@langchain/core/messages";

    const model = new ChatOpenAI({
      apiKey: process.env.OPENAI_API_KEY,
      model: "gpt-4-1106-preview",
    });
    const stream = await model.stream(new HumanMessage("Hello world!"));
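
    The value returned by stream() is an async iterable of message chunks, so it can be consumed with a for await loop. A minimal sketch, continuing from the stream variable above:

    for await (const chunk of stream) {
      // Each chunk is an AIMessageChunk; log its text content as it arrives.
      console.log(chunk.content);
    }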

    This package also adds support for OpenAI's embeddings model.

    import { OpenAIEmbeddings } from "@langchain/openai";

    const embeddings = new OpenAIEmbeddings({
      apiKey: process.env.OPENAI_API_KEY,
    });
    const res = await embeddings.embedQuery("Hello world");
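
    The same class also exposes embedDocuments for embedding several texts in one call. A brief sketch, reusing the embeddings instance from the example above:

    // Returns one embedding vector per input string.
    const vectors = await embeddings.embedDocuments([
      "Hello world",
      "Goodbye world",
    ]);
    console.log(vectors.length, vectors[0].length);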

    To develop the OpenAI package, you'll need to follow these instructions:

    pnpm install
    
    pnpm build
    

    Or from the repo root:

    pnpm build --filter=@langchain/openai
    

    Test files should live within a tests/ folder in the src/ folder. Unit tests should end in .test.ts and integration tests should end in .int.test.ts:

    $ pnpm test
    $ pnpm test:int
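
    As an illustration, a minimal unit test might look like the following sketch. It assumes the repo's existing Jest setup, and the file path src/tests/chat_models.test.ts is only an example:

    import { test, expect } from "@jest/globals";
    import { ChatOpenAI } from "../chat_models.js";

    test("ChatOpenAI stores the configured model name", () => {
      // Construct the model without making any network calls.
      const model = new ChatOpenAI({ model: "gpt-4-1106-preview", apiKey: "test-key" });
      expect(model.model).toBe("gpt-4-1106-preview");
    });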

    Run the linter & formatter to ensure your code is up to standard:

    pnpm lint && pnpm format
    

    If you add a new file to be exported, either import & re-export from src/index.ts, or add it to the exports field in the package.json file and run pnpm build to generate the new entrypoint.
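
    For example, if you added a new module at src/tools/my_tool.ts (a hypothetical file name), the re-export approach would be a one-line addition:

    // src/index.ts — re-export the new module so it is reachable from the package root.
    export * from "./tools/my_tool.js";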

    Classes

    AzureChatOpenAI
    AzureOpenAI
    AzureOpenAIEmbeddings
    ChatOpenAI
    DallEAPIWrapper
    OpenAI
    OpenAIEmbeddings

    Interfaces

    AzureOpenAIChatInput
    AzureOpenAIInput
    BaseChatOpenAICallOptions
    BaseChatOpenAIFields
    ChatOpenAICompletionsCallOptions
    ChatOpenAIFields
    ChatOpenAIResponsesCallOptions
    DallEAPIWrapperParams
    OpenAIBaseInput
    OpenAICallOptions
    OpenAIChatInput
    OpenAIEmbeddingsParams
    OpenAIEndpointConfig
    OpenAIInput

    Type Aliases

    ChatOpenAICallOptions
    ChatOpenAIReasoningSummary
    ChatOpenAIResponseFormat
    HeadersLike
    OpenAIChatModelId
    OpenAICoreRequestOptions
    OpenAIEmbeddingModelId
    OpenAIImageModelId
    OpenAIVerbosityParam
    ResponseFormatConfiguration

    Functions

    _convertMessagesToOpenAIParams
    convertPromptToOpenAI
    customTool
    getEndpoint
    isHeaders
    messageToOpenAIRole
    normalizeHeaders
    wrapOpenAIClientError