experimental/vercel

Module · Since v0.4

import { ... } from "langsmith/experimental/vercel";

Functions

function convertMessageToTracedFormat → Record<string, unknown>

function createLangSmithProviderOptions → Record<string, JSONValue>

Wraps LangSmith config in a way that matches AI SDK provider types.

import { createLangSmithProviderOptions } from "langsmith/experimental/vercel";
import * as ai from "ai";

const lsConfig = createLangSmithProviderOptions<typeof ai.generateText>({
  // `inputs` is typed to match the parameters of ai.generateText
  processInputs: (inputs) => {
    const { messages } = inputs;
    return {
      messages: messages?.map((message) => ({
        ...message,
        content: "REDACTED",
      })),
      prompt: "REDACTED",
    };
  },
});

Note: AI SDK expects only JSON values in an object for provider options, but LangSmith's config may contain non-JSON values. These are not passed to the underlying AI SDK model, so it is safe to cast the typing here.
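
A hedged follow-up sketch, continuing the example above: the config is attached to a call made through wrapAISDK (documented below). The providerOptions key name "langsmith", the model, and the prompt are illustrative assumptions rather than part of this reference.

import { openai } from "@ai-sdk/openai";
import { wrapAISDK } from "langsmith/experimental/vercel";

const { generateText } = wrapAISDK(ai);

// Assumption: the config created above is forwarded under providerOptions so
// the wrapped call can pick it up; processInputs then redacts message content
// before the inputs are recorded on the LangSmith trace.
const { text } = await generateText({
  model: openai("gpt-4o-mini"),
  prompt: "What is the capital of France?",
  providerOptions: {
    langsmith: lsConfig,
  },
});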

function wrapAISDK → T

Wraps Vercel AI SDK 6 or AI SDK 5 functions with LangSmith tracing capabilities.
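
A minimal usage sketch, assuming the wrapped functions keep their original AI SDK signatures; the provider package, model, and prompt are illustrative, not prescribed by this reference.

import * as ai from "ai";
import { openai } from "@ai-sdk/openai";
import { wrapAISDK } from "langsmith/experimental/vercel";

// Wrapped counterparts of the AI SDK functions; calls made through them
// are traced to LangSmith.
const { generateText, streamText, generateObject, streamObject } = wrapAISDK(ai);

const { text } = await generateText({
  model: openai("gpt-4o-mini"),
  prompt: "Summarize the plot of Hamlet in two sentences.",
});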

Type Aliases

typeAlias AggregatedDoStreamOutput

typeAlias WrapAISDKConfig: Partial<Omit<RunTreeConfig, "inputs" | "outputs" | "run_type" | "child_runs" | "parent_run" | "error" | "serialized">> & __type
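
A hedged sketch of how a WrapAISDKConfig might be supplied, assuming wrapAISDK accepts it as an optional second argument for default run settings; the field values below are illustrative.

import * as ai from "ai";
import { wrapAISDK } from "langsmith/experimental/vercel";

// Assumption: RunTreeConfig fields such as name, tags, and metadata set
// defaults for every run created through the wrapped functions.
const { generateText } = wrapAISDK(ai, {
  name: "recipe-assistant",
  tags: ["experimental"],
  metadata: { environment: "staging" },
});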