# SagemakerEndpoint

> **Class** in `langchain_aws`

📖 [View in docs](https://reference.langchain.com/python/langchain-aws/llms/sagemaker_endpoint/SagemakerEndpoint)

Amazon SageMaker Inference Endpoint models.

To use, you must supply the endpoint name of your deployed
SageMaker model and the region in which it is deployed.

To authenticate, the AWS client automatically loads credentials
using the methods described here:
https://boto3.amazonaws.com/v1/documentation/api/latest/guide/credentials.html

To use a specific credential profile, pass the name of a profile
from your `~/.aws/credentials` file.

Make sure the credentials or roles in use have the required policies
to access the SageMaker endpoint.

See: https://docs.aws.amazon.com/IAM/latest/UserGuide/access_policies.html

## Signature

```python
SagemakerEndpoint()
```

## Extends

- `LLM`

## Properties

- `client`
- `endpoint_name`
- `inference_component_name`
- `region_name`
- `credentials_profile_name`
- `aws_access_key_id`
- `aws_secret_access_key`
- `aws_session_token`
- `config`
- `endpoint_url`
- `content_handler`
- `streaming`
- `model_kwargs`
- `endpoint_kwargs`
- `model_config`
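
Of these, `content_handler` carries the model-specific logic: it serializes the prompt into the request body the endpoint expects and parses the endpoint's response back into text. The sketch below shows the general shape of such a handler for a hypothetical JSON-in/JSON-out endpoint; the class name `JsonContentHandler`, the request keys (`inputs`, `generated_text`), and the payload format are illustrative assumptions, not part of the library's API.

```python
import json


class JsonContentHandler:
    # Sketch of a content handler for a hypothetical endpoint that accepts
    # and returns JSON. A real handler would subclass the library's
    # LLMContentHandler; the transform methods below illustrate the pattern.
    content_type = "application/json"
    accepts = "application/json"

    def transform_input(self, prompt: str, model_kwargs: dict) -> bytes:
        # Merge the prompt with any model parameters into the request body.
        return json.dumps({"inputs": prompt, **model_kwargs}).encode("utf-8")

    def transform_output(self, output: bytes) -> str:
        # Parse the endpoint's JSON response and extract the generated text.
        return json.loads(output)["generated_text"]


handler = JsonContentHandler()
body = handler.transform_input("Hello", {"temperature": 0.2})
text = handler.transform_output(b'{"generated_text": "Hi there"}')
```

The same handler instance would then be passed as the `content_handler` argument when constructing `SagemakerEndpoint`, alongside `endpoint_name` and `region_name`.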

## Methods

- [`validate_environment()`](https://reference.langchain.com/python/langchain-aws/llms/sagemaker_endpoint/SagemakerEndpoint/validate_environment)

---

[View source on GitHub](https://github.com/langchain-ai/langchain-aws/blob/10d18256d46953e5fc8dca313a2c41eee29c2a80/libs/aws/langchain_aws/llms/sagemaker_endpoint.py#L95)