# ChatSagemakerEndpoint

> **Class** in `langchain_aws`

📖 [View in docs](https://reference.langchain.com/python/langchain-aws/chat_models/sagemaker_endpoint/ChatSagemakerEndpoint)

A chat model that uses a Hugging Face TGI-compatible SageMaker endpoint.

To use, you must supply the endpoint name of your deployed
SageMaker model and the region where it is deployed.

To authenticate, the AWS client automatically loads credentials
using the standard boto3 credential chain:
https://boto3.amazonaws.com/v1/documentation/api/latest/guide/credentials.html

To use a specific credential profile, pass the name of the profile
from your ~/.aws/credentials file.

Make sure the credentials or roles used have the required policies to
access the SageMaker endpoint.
See: https://docs.aws.amazon.com/IAM/latest/UserGuide/access_policies.html

## Signature

```python
ChatSagemakerEndpoint()
```
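At its core, the model's content handler translates chat messages into the JSON body the endpoint expects and extracts the reply from the response bytes. The sketch below illustrates that round trip with the standard library only; the function names `transform_input` and `transform_output`, and the OpenAI-style payload shape, are assumptions for illustration — a real handler would subclass the library's content-handler base class, and the payload must match your deployed container.

```python
import json

# Hypothetical sketch of the request/response translation a content
# handler performs for a TGI-style SageMaker endpoint. The payload
# shape below is an assumption, not the library's guaranteed format.

def transform_input(messages, model_kwargs):
    """Serialize chat messages into the JSON body sent to the endpoint."""
    payload = {"messages": messages, **model_kwargs}
    return json.dumps(payload).encode("utf-8")

def transform_output(raw_bytes):
    """Extract the assistant reply from the endpoint's JSON response."""
    body = json.loads(raw_bytes.decode("utf-8"))
    return body["choices"][0]["message"]["content"]

# Build a request body the way a handler would before invoking the endpoint.
body = transform_input(
    [{"role": "user", "content": "Hello"}],
    {"max_tokens": 64, "temperature": 0.2},
)

# Simulate a response so the decode path can be exercised without AWS.
fake_response = json.dumps(
    {"choices": [{"message": {"role": "assistant", "content": "Hi!"}}]}
).encode("utf-8")
reply = transform_output(fake_response)
print(reply)
```

When constructing the class itself, the `endpoint_name`, `region_name`, `credentials_profile_name`, and `content_handler` properties listed below are the fields that wire this translation to a live endpoint.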

## Extends

- `BaseChatModel`

## Properties

- `client`
- `endpoint_name`
- `inference_component_name`
- `region_name`
- `credentials_profile_name`
- `aws_access_key_id`
- `aws_secret_access_key`
- `aws_session_token`
- `endpoint_url`
- `config`
- `content_handler`
- `streaming`
- `model_kwargs`
- `endpoint_kwargs`
- `model_config`
- `lc_attributes`

## Methods

- [`validate_environment()`](https://reference.langchain.com/python/langchain-aws/chat_models/sagemaker_endpoint/ChatSagemakerEndpoint/validate_environment)
- [`get_lc_namespace()`](https://reference.langchain.com/python/langchain-aws/chat_models/sagemaker_endpoint/ChatSagemakerEndpoint/get_lc_namespace)

---

[View source on GitHub](https://github.com/langchain-ai/langchain-aws/blob/6f2c135c815a3469f42b20321f585143dacbb889/libs/aws/langchain_aws/chat_models/sagemaker_endpoint.py#L111)