credentials (Optional): AWS Credentials. If no credentials are provided, the default credentials from @aws-sdk/credential-provider-node will be used.
endpointHost (Optional): Override the default endpoint hostname.
fetchFn (Optional): A custom fetch function for low-level access to the AWS API. Defaults to fetch().
maxTokens (Optional): Max tokens.
model: Model to use. For example, "amazon.titan-tg1-large"; this is equivalent to the modelId property in the list-foundation-models API.
modelKwargs (Optional): Additional kwargs to pass to the model.
region: The AWS region, e.g. us-west-2. Falls back to the AWS_DEFAULT_REGION env variable or the region specified in ~/.aws/config if it is not provided here.
streaming (Optional): Whether or not to stream responses.
temperature (Optional): Temperature.
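Taken together, these fields map onto the constructor options. The sketch below shows a minimal construction; the import path is an assumption (newer releases export the class from "@langchain/community/llms/bedrock", older ones from "langchain/llms/bedrock"), and the model id, region, and numeric values are placeholders.

```typescript
// Minimal sketch of constructing the Bedrock LLM with the fields documented above.
import { Bedrock } from "@langchain/community/llms/bedrock";

const model = new Bedrock({
  model: "amazon.titan-tg1-large", // modelId as listed by the list-foundation-models API
  region: "us-west-2",             // otherwise falls back to AWS_DEFAULT_REGION / ~/.aws/config
  temperature: 0.7,                // sampling temperature
  maxTokens: 256,                  // cap on generated tokens
  streaming: false,                // set true to stream responses
  // credentials: { accessKeyId: "<ACCESS_KEY_ID>", secretAccessKey: "<SECRET_ACCESS_KEY>" },
  //   if omitted, the default credentials from @aws-sdk/credential-provider-node are used
  // modelKwargs: { topP: 0.9 },   // additional provider-specific parameters (illustrative value)
});
```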
Call out to the Bedrock service model.

Arguments:
prompt: The prompt to pass into the model.
options (Optional): unknown
runManager (Optional): any

Returns: The string generated by the model.

Example: response = model.invoke("Tell me a joke.")
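Usage sketch for the call path described above: invoke() routes the prompt through this method and resolves to the generated string. The model variable is assumed to be the Bedrock instance constructed earlier.

```typescript
// Invoking the model returns the generated string.
const response = await model.invoke("Tell me a joke.");
console.log(response);
```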
A type of Large Language Model (LLM) that interacts with the Bedrock service. It extends the base LLM class and implements the BaseBedrockInput interface. The class is designed to authenticate and interact with the Bedrock service, which is part of Amazon Web Services (AWS). It uses AWS credentials for authentication and can be configured with various parameters such as the model to use, the AWS region, and the maximum number of tokens to generate.
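As a sketch of the authentication path, explicit credentials can be supplied instead of relying on the default provider chain. The fromIni provider and the "my-profile" name below are illustrative assumptions; any AWS SDK v3 credential object or provider accepted by the credentials field works the same way.

```typescript
// Sketch: authenticating with a named profile instead of the default credentials
// resolved by @aws-sdk/credential-provider-node.
import { fromIni } from "@aws-sdk/credential-providers";
import { Bedrock } from "@langchain/community/llms/bedrock";

const modelWithProfile = new Bedrock({
  model: "amazon.titan-tg1-large",
  region: "us-west-2",
  credentials: fromIni({ profile: "my-profile" }), // profile from ~/.aws/credentials
});
```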