fields (optional): ChatBedrockConverseInput

additionalModelRequestFields (optional): Additional inference parameters that the model supports, beyond the base set of inference parameters that the Converse API supports in the inferenceConfig field. For more information, see the model parameters link below.
client (optional): The BedrockRuntimeClient to use. Lets you override the default client with a custom one, for example to supply a requestHandler (NodeHttpHandler).
clientConfig (optional): Overrideable configuration options for the BedrockRuntimeClient. Allows customization of the client configuration, such as the requestHandler. Ignored if client is provided.
endpointHost (optional): Override the default endpoint hostname.
guardrailConfig (optional): Configuration information for a guardrail that you want to use in the request.
maxTokens (optional): The maximum number of tokens to generate in the response.
model: The model to use, e.g. "anthropic.claude-3-haiku-20240307-v1:0". Equivalent to the modelId property in the list-foundation-models API. See the link below for a full list of models.
performanceConfig (optional): Model performance configuration. See https://docs.aws.amazon.com/bedrock/latest/userguide/latency-optimized-inference.html
region: The AWS region, e.g. us-west-2. Falls back to the AWS_DEFAULT_REGION environment variable or the region specified in ~/.aws/config if not provided here.
streaming: Whether or not to stream responses.
streamUsage: Whether or not to include usage data, like token counts, in the streamed response chunks. Passing this as a call option takes precedence over the class-level setting.
supportsToolChoiceValues (optional): Which types of tool_choice values the model supports. Inferred if not specified: ['auto', 'any', 'tool'] if a 'claude-3' model is used, ['auto', 'any'] if a 'mistral-large' model is used, empty otherwise.
temperature (optional): The sampling temperature to use.
topP (optional): The percentage of most-likely candidates that the model considers for the next token. For example, if you choose a value of 0.8 for topP, the model selects from the top 80% of the probability distribution of tokens that could be next in the sequence. The default value is the default for the model that you are using. For more information, see the inference parameters for foundation models link below.
AWS Bedrock Converse chat model integration.
Setup: Install @langchain/aws and set the following environment variables:
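For example, assuming npm and environment-variable credentials picked up by the AWS SDK's default provider chain:

```shell
npm install @langchain/aws @langchain/core

# One common way to supply region and credentials; the AWS SDK default
# provider chain also reads ~/.aws/config and ~/.aws/credentials.
export AWS_DEFAULT_REGION="us-east-1"
export AWS_ACCESS_KEY_ID="..."
export AWS_SECRET_ACCESS_KEY="..."
```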
Constructor args

Runtime args

Runtime args can be passed as the second argument to any of the base runnable methods: .invoke, .stream, .batch, etc. They can also be passed via .withConfig, or as the second arg in .bindTools, as shown in the examples below.

Examples
Instantiate
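A minimal sketch; the model id and region are illustrative placeholders.

```typescript
import { ChatBedrockConverse } from "@langchain/aws";

const llm = new ChatBedrockConverse({
  model: "anthropic.claude-3-haiku-20240307-v1:0",
  region: "us-east-1", // or rely on AWS_DEFAULT_REGION / ~/.aws/config
  temperature: 0,
  maxTokens: 1024,
});
```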
Invoking
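A sketch of a single call with a list of role/content message tuples; the model id and messages are illustrative.

```typescript
import { ChatBedrockConverse } from "@langchain/aws";

const llm = new ChatBedrockConverse({
  model: "anthropic.claude-3-haiku-20240307-v1:0",
});

const aiMsg = await llm.invoke([
  ["system", "You are a helpful assistant that translates English to French."],
  ["human", "I love programming."],
]);
console.log(aiMsg.content);
```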
Streaming Chunks
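A streaming sketch: each iteration yields a chunk carrying a partial response. Model id and prompt are illustrative.

```typescript
import { ChatBedrockConverse } from "@langchain/aws";

const llm = new ChatBedrockConverse({
  model: "anthropic.claude-3-haiku-20240307-v1:0",
});

const stream = await llm.stream("Why is the sky blue?");
for await (const chunk of stream) {
  console.log(chunk.content);
}
```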
Aggregate Streamed Chunks
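Streamed chunks can be merged back into a single message with concat from @langchain/core; a sketch under the same illustrative setup:

```typescript
import { ChatBedrockConverse } from "@langchain/aws";
import { AIMessageChunk } from "@langchain/core/messages";
import { concat } from "@langchain/core/utils/stream";

const llm = new ChatBedrockConverse({
  model: "anthropic.claude-3-haiku-20240307-v1:0",
});

const stream = await llm.stream("Why is the sky blue?");
let full: AIMessageChunk | undefined;
for await (const chunk of stream) {
  // concat merges content, tool-call chunks, and metadata pairwise.
  full = full !== undefined ? concat(full, chunk) : chunk;
}
console.log(full?.content);
```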
Bind tools
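A sketch of binding a tool described by a name, a description, and a zod schema; the GetWeather tool is a made-up example.

```typescript
import { ChatBedrockConverse } from "@langchain/aws";
import { z } from "zod";

const llm = new ChatBedrockConverse({
  model: "anthropic.claude-3-haiku-20240307-v1:0",
});

// A hypothetical tool definition for illustration.
const GetWeather = {
  name: "GetWeather",
  description: "Get the current weather in a given location",
  schema: z.object({
    location: z.string().describe("The city and state, e.g. San Francisco, CA"),
  }),
};

const llmWithTools = llm.bindTools([GetWeather]);
const aiMsg = await llmWithTools.invoke("What is the weather like in San Francisco?");
// Parsed tool invocations requested by the model, if any.
console.log(aiMsg.tool_calls);
```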
Structured Output
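A sketch using withStructuredOutput with a zod schema; the Joke schema is illustrative.

```typescript
import { ChatBedrockConverse } from "@langchain/aws";
import { z } from "zod";

const llm = new ChatBedrockConverse({
  model: "anthropic.claude-3-haiku-20240307-v1:0",
});

const Joke = z
  .object({
    setup: z.string().describe("The setup of the joke"),
    punchline: z.string().describe("The punchline of the joke"),
  })
  .describe("A joke to tell the user");

const structuredLlm = llm.withStructuredOutput(Joke, { name: "Joke" });
const joke = await structuredLlm.invoke("Tell me a joke about cats");
// `joke` is a typed object matching the schema, not raw text.
console.log(joke.setup, joke.punchline);
```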
Multimodal
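A sketch of sending an image alongside text; imageBase64 is a placeholder for real base64-encoded image data.

```typescript
import { ChatBedrockConverse } from "@langchain/aws";
import { HumanMessage } from "@langchain/core/messages";

const llm = new ChatBedrockConverse({
  model: "anthropic.claude-3-haiku-20240307-v1:0",
});

const imageBase64 = "..."; // placeholder: base64-encoded JPEG data
const message = new HumanMessage({
  content: [
    { type: "text", text: "Describe this image." },
    { type: "image_url", image_url: { url: `data:image/jpeg;base64,${imageBase64}` } },
  ],
});
const aiMsg = await llm.invoke([message]);
console.log(aiMsg.content);
```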
Usage Metadata
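Token counts are exposed on the response's usage_metadata field; a sketch:

```typescript
import { ChatBedrockConverse } from "@langchain/aws";

const llm = new ChatBedrockConverse({
  model: "anthropic.claude-3-haiku-20240307-v1:0",
});

const aiMsg = await llm.invoke("Hello!");
// Shape: { input_tokens, output_tokens, total_tokens }
console.log(aiMsg.usage_metadata);
```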
Stream Usage Metadata
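With streamUsage enabled (see the streamUsage field), usage counts arrive on the streamed chunks and survive aggregation; a sketch:

```typescript
import { ChatBedrockConverse } from "@langchain/aws";
import { AIMessageChunk } from "@langchain/core/messages";
import { concat } from "@langchain/core/utils/stream";

const llm = new ChatBedrockConverse({
  model: "anthropic.claude-3-haiku-20240307-v1:0",
  streamUsage: true, // can also be passed per call
});

const stream = await llm.stream("Hello!");
let full: AIMessageChunk | undefined;
for await (const chunk of stream) {
  full = full !== undefined ? concat(full, chunk) : chunk;
}
console.log(full?.usage_metadata);
```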
Response Metadata
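Provider-specific details, such as the stop reason and latency metrics returned by the Converse API, are surfaced on response_metadata; a sketch:

```typescript
import { ChatBedrockConverse } from "@langchain/aws";

const llm = new ChatBedrockConverse({
  model: "anthropic.claude-3-haiku-20240307-v1:0",
});

const aiMsg = await llm.invoke("Hello!");
// Raw provider metadata from the Converse response.
console.log(aiMsg.response_metadata);
```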