# XinferenceEmbeddings

> **Class** in `langchain_community`

📖 [View in docs](https://reference.langchain.com/python/langchain-community/embeddings/xinference/XinferenceEmbeddings)

Xinference embedding models.

To use, you should have the xinference library installed:

.. code-block:: bash

    pip install xinference

If you only need to connect to an existing Xinference deployment as a client, the lighter-weight xinference_client package is sufficient:

.. code-block:: bash

    pip install xinference_client

See https://github.com/xorbitsai/inference for details.
To run Xinference in a distributed setup, start a Xinference supervisor on one server and Xinference workers on the remaining servers.
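
As an alternative to the CLI workflow described below, a model can also be launched programmatically through the xinference_client package. A minimal sketch, assuming a server reachable at the address shown; the model name `bge-small-en-v1.5` is an illustrative choice, not a requirement:

```python
# Hypothetical sketch: launching an embedding model via the RESTful client
# instead of the CLI. The server address and model name are assumptions;
# adapt them to your deployment.
from xinference_client import RESTfulClient

client = RESTfulClient("http://127.0.0.1:9997")

# launch_model returns the model UID that XinferenceEmbeddings needs
model_uid = client.launch_model(
    model_name="bge-small-en-v1.5",  # illustrative embedding model name
    model_type="embedding",
)
print(model_uid)
```

The returned UID plays the same role as the one printed by `xinference launch` on the command line.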

## Signature

```python
XinferenceEmbeddings(
    self,
    server_url: Optional[str] = None,
    model_uid: Optional[str] = None,
)
```

## Description

**Example:**

To start a local instance of Xinference, run:

.. code-block:: bash

   $ xinference

You can also deploy Xinference in a distributed cluster. Here are the steps:

Starting the supervisor:

.. code-block:: bash

   $ xinference-supervisor

Starting the worker:

.. code-block:: bash

   $ xinference-worker

Then, launch a model using the command-line interface (CLI).

Example:

.. code-block:: bash

   $ xinference launch -n orca -s 3 -q q4_0

This command returns a model UID. You can then use XinferenceEmbeddings with LangChain.

Example:

.. code-block:: python

    from langchain_community.embeddings import XinferenceEmbeddings

    xinference = XinferenceEmbeddings(
        server_url="http://0.0.0.0:9997",
        model_uid=model_uid,  # the model UID returned when launching the model
    )
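
Once constructed, the instance exposes the standard `Embeddings` interface via `embed_documents()` and `embed_query()` (listed under Methods below). A hedged usage sketch, assuming a running server and an already-launched embedding model whose UID is bound to `model_uid`:

```python
from langchain_community.embeddings import XinferenceEmbeddings

# Assumptions: a Xinference server is reachable at this address and
# model_uid holds the UID of a previously launched embedding model.
model_uid = "REPLACE-WITH-YOUR-MODEL-UID"

xinference = XinferenceEmbeddings(
    server_url="http://0.0.0.0:9997",
    model_uid=model_uid,
)

# embed_documents returns one embedding vector per input text
doc_vectors = xinference.embed_documents(["Hello, world!", "Goodbye, world!"])

# embed_query returns a single embedding vector for a query string
query_vector = xinference.embed_query("Hello, world!")
```

`embed_documents()` is intended for batches of texts to be indexed, while `embed_query()` embeds a single search query; some embedding models apply different prompts to the two cases, which is why the interface keeps them separate.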

## Extends

- `Embeddings`

## Constructors

```python
__init__(
    self,
    server_url: Optional[str] = None,
    model_uid: Optional[str] = None,
)
```

| Name | Type |
|------|------|
| `server_url` | `Optional[str]` |
| `model_uid` | `Optional[str]` |


## Properties

- `client`
- `server_url`
- `model_uid`

## Methods

- [`embed_documents()`](https://reference.langchain.com/python/langchain-community/embeddings/xinference/XinferenceEmbeddings/embed_documents)
- [`embed_query()`](https://reference.langchain.com/python/langchain-community/embeddings/xinference/XinferenceEmbeddings/embed_query)

---

[View source on GitHub](https://github.com/langchain-ai/langchain-community/blob/d5ea8358933260ad48dd31f7f8076555c7b4885a/libs/community/langchain_community/embeddings/xinference.py#L8)