Optional credentials: AWS Credentials.
If no credentials are provided, the default credentials from @aws-sdk/credential-provider-node will be used.
Optional endpointHost: Override the default endpoint hostname.
fetchFn: A custom fetch function for low-level access to the AWS API; it receives the standard fetch arguments (input, and an optional init: RequestInit). Defaults to fetch().
Optional maxTokens: Max tokens.
model: Model to use. For example, "amazon.titan-tg1-large". This is equivalent to the modelId property in the list-foundation-models API.
Optional modelKwargs: Additional kwargs to pass to the model.
region: The AWS region, e.g. us-west-2.
Falls back to the AWS_DEFAULT_REGION environment variable or the region specified in ~/.aws/config if it is not provided here.
streaming: Whether or not to stream responses.
Optional temperature: Temperature.
A configuration sketch showing these fields together follows below.
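As a minimal configuration sketch, the fields above map onto the constructor options roughly as follows. The import path, model ID, region, and environment variable names are illustrative assumptions (package layout varies across LangChain versions), not part of this reference:

import { Bedrock } from "@langchain/community/llms/bedrock";

// Illustrative values only; use a model that is enabled for your AWS account.
const model = new Bedrock({
  model: "amazon.titan-tg1-large", // equivalent to modelId in the list-foundation-models API
  region: "us-west-2",             // falls back to AWS_DEFAULT_REGION or ~/.aws/config when omitted
  // Omitting credentials uses the default chain from @aws-sdk/credential-provider-node.
  credentials: {
    accessKeyId: process.env.AWS_ACCESS_KEY_ID!,
    secretAccessKey: process.env.AWS_SECRET_ACCESS_KEY!,
  },
  temperature: 0.7,
  maxTokens: 256,
  modelKwargs: {},   // additional model-specific parameters
  streaming: false,  // set to true to stream responses
});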
Call out to the Bedrock service model.
Arguments: prompt: The prompt to pass into the model. Optional options: Call options for the request. Optional runManager: Callback manager for the run.
Returns: The string generated by the model.
Example: const response = await model.invoke("Tell me a joke.");
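A short usage sketch of the call above, assuming the model instance from the configuration sketch earlier and an async context; the prompt is illustrative:

async function main() {
  // Pass the prompt to the model; the return value is the string generated by the model.
  const response = await model.invoke("Tell me a joke.");
  console.log(response);
}

main().catch(console.error);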
A type of Large Language Model (LLM) that interacts with the Bedrock service. It extends the base
LLM class and implements the BaseBedrockInput interface. The class is designed to authenticate and interact with the Bedrock service, which is part of Amazon Web Services (AWS). It uses AWS credentials for authentication and can be configured with various parameters such as the model to use, the AWS region, and the maximum number of tokens to generate.
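Where token-by-token output is wanted, the streaming option pairs with the standard runnable streaming interface. A sketch, again assuming the model instance from the configuration example, an async context, and a Bedrock model that supports streaming:

// Stream the completion as it is generated instead of waiting for the full string.
const stream = await model.stream("Write a short poem about rivers.");

for await (const chunk of stream) {
  // For an LLM, each chunk is a string fragment of the generated text.
  process.stdout.write(chunk);
}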