# SagemakerEndpoint

> **Class** in `langchain_community`

📖 [View in docs](https://reference.langchain.com/python/langchain-community/llms/sagemaker_endpoint/SagemakerEndpoint)

Sagemaker Inference Endpoint models.

To use, you must supply the endpoint name of your deployed
Sagemaker model and the region where it is deployed.

To authenticate, the AWS client uses the following methods to
automatically load credentials:
https://boto3.amazonaws.com/v1/documentation/api/latest/guide/credentials.html

To use a specific credential profile, pass the name of a profile
from your ~/.aws/credentials file via `credentials_profile_name`.

Make sure the credentials or role used has the required policies to
access the Sagemaker endpoint.
See: https://docs.aws.amazon.com/IAM/latest/UserGuide/access_policies.html

## Signature

```python
SagemakerEndpoint()
```

## Extends

- `LLM`

## Properties

- `client`
- `endpoint_name`
- `region_name`
- `credentials_profile_name`
- `content_handler`
- `streaming`
- `model_kwargs`
- `endpoint_kwargs`
- `model_config`

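The `content_handler` property points to an object that serializes prompts into the request format your endpoint expects and parses its responses back into text. Below is a minimal, self-contained sketch of those two transforms, assuming a JSON request/response schema with `inputs` and `generated_text` fields; the field names and the `ExampleContentHandler` class are hypothetical stand-ins, and a real handler would subclass `LLMContentHandler` from `langchain_community.llms.sagemaker_endpoint` instead:

```python
import json


class ExampleContentHandler:
    """Hypothetical stand-in mimicking the shape of an LLMContentHandler.

    A real handler subclasses LLMContentHandler and implements the same
    two transforms for your deployed model's request/response schema.
    """

    content_type = "application/json"
    accepts = "application/json"

    def transform_input(self, prompt: str, model_kwargs: dict) -> bytes:
        # Serialize the prompt plus any model parameters into the
        # JSON body sent to the Sagemaker endpoint.
        return json.dumps({"inputs": prompt, **model_kwargs}).encode("utf-8")

    def transform_output(self, output: bytes) -> str:
        # Parse the endpoint's JSON response and extract the
        # generated text (field name assumed here).
        response = json.loads(output.decode("utf-8"))
        return response["generated_text"]
```

With the real class, an instance of your handler is passed as `content_handler` alongside `endpoint_name` and `region_name` when constructing `SagemakerEndpoint`.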
## Methods

- [`validate_environment()`](https://reference.langchain.com/python/langchain-community/llms/sagemaker_endpoint/SagemakerEndpoint/validate_environment)

## ⚠️ Deprecated

Deprecated since version 0.3.16. Use `SagemakerEndpoint` from the `langchain-aws` package instead.

---

[View source on GitHub](https://github.com/langchain-ai/langchain-community/blob/4b280287bd55b99b44db2dd849f02d66c89534d5/libs/community/langchain_community/llms/sagemaker_endpoint.py#L128)