# SagemakerEndpointEmbeddings

> **Class** in `langchain_community`

📖 [View in docs](https://reference.langchain.com/python/langchain-community/embeddings/sagemaker_endpoint/SagemakerEndpointEmbeddings)

Embeddings backed by a custom SageMaker inference endpoint.

To use, you must supply the name of your deployed
SageMaker endpoint and the region where it is deployed.

To authenticate, the AWS client automatically loads credentials
using the methods described in the boto3 credentials guide:
https://boto3.amazonaws.com/v1/documentation/api/latest/guide/credentials.html

To use a specific credential profile, pass the name of the
profile from your `~/.aws/credentials` file.

Make sure the credentials or role you use has the IAM policies
required to access the SageMaker endpoint.
See: https://docs.aws.amazon.com/IAM/latest/UserGuide/access_policies.html

## Signature

```python
SagemakerEndpointEmbeddings()
```

## Extends

- `BaseModel`
- `Embeddings`

## Properties

- `client`
- `endpoint_name`
- `region_name`
- `credentials_profile_name`
- `content_handler`
- `model_kwargs`
- `endpoint_kwargs`
- `model_config`

## Methods

- [`validate_environment()`](https://reference.langchain.com/python/langchain-community/embeddings/sagemaker_endpoint/SagemakerEndpointEmbeddings/validate_environment)
- [`embed_documents()`](https://reference.langchain.com/python/langchain-community/embeddings/sagemaker_endpoint/SagemakerEndpointEmbeddings/embed_documents)
- [`embed_query()`](https://reference.langchain.com/python/langchain-community/embeddings/sagemaker_endpoint/SagemakerEndpointEmbeddings/embed_query)
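`embed_documents()` embeds a batch of texts and `embed_query()` embeds a single string; both route the payload through the content handler and the SageMaker runtime's `invoke_endpoint` call. The sketch below wires a fake boto3-style client to show that round trip end to end; the stub client, the `embed` helper, and the JSON schema are all assumptions for illustration, not the class's actual internals.

```python
import io
import json


class FakeSageMakerClient:
    """Stand-in for a boto3 sagemaker-runtime client (illustration only)."""

    def invoke_endpoint(self, EndpointName, Body, ContentType, Accept=None):
        texts = json.loads(Body)["inputs"]
        # Return one dummy vector per input text, mimicking the
        # {"Body": <stream>} shape of a real invoke_endpoint response.
        vectors = [[float(len(t))] for t in texts]
        return {"Body": io.BytesIO(json.dumps({"vectors": vectors}).encode())}


def embed(client, endpoint_name, texts):
    # Mirrors the flow the class performs per batch:
    # transform_input -> invoke_endpoint -> transform_output.
    body = json.dumps({"inputs": texts}).encode("utf-8")
    response = client.invoke_endpoint(
        EndpointName=endpoint_name,
        Body=body,
        ContentType="application/json",
    )
    return json.loads(response["Body"].read().decode("utf-8"))["vectors"]


client = FakeSageMakerClient()
print(embed(client, "my-endpoint", ["hi", "hello"]))  # [[2.0], [5.0]]
```

With a real deployment you would not call `invoke_endpoint` yourself: instantiate the class with your endpoint name, region, and content handler, then call `embed_documents()` / `embed_query()` directly.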

---

[View source on GitHub](https://github.com/langchain-ai/langchain-community/blob/4b280287bd55b99b44db2dd849f02d66c89534d5/libs/community/langchain_community/embeddings/sagemaker_endpoint.py#L14)