# OCIModelDeploymentVLLM

> **Class** in `langchain_community`

📖 [View in docs](https://reference.langchain.com/python/langchain-community/llms/oci_data_science_model_deployment_endpoint/OCIModelDeploymentVLLM)

vLLM deployed on an OCI Data Science Model Deployment endpoint.

To use, you must provide the HTTP endpoint of your deployed
model, e.g. `https://modeldeployment.<region>.oci.customer-oci.com/<md_ocid>/predict`.

For authentication, the `oracle-ads` library is used to automatically load
credentials: https://accelerated-data-science.readthedocs.io/en/latest/user_guide/cli/authentication.html

Make sure you have the required policies to access the OCI Data
Science Model Deployment endpoint. See:
https://docs.oracle.com/en-us/iaas/data-science/using/model-dep-policies-auth.htm#model_dep_policies_auth__predict-endpoint
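
Credentials are typically configured with `oracle-ads` before constructing the LLM. A minimal configuration sketch, assuming the `oracle-ads` package is installed and you are either using local API keys or running inside an OCI environment with resource principals enabled:

```python
import ads

# Load credentials from an API key in ~/.oci/config (DEFAULT profile).
ads.set_auth(auth="api_key")

# Alternatively, when running inside OCI (e.g. a notebook session or job),
# resource principals avoid managing key files:
# ads.set_auth(auth="resource_principal")
```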

## Signature

```python
OCIModelDeploymentVLLM()
```

## Description

**Example:**

```python
from langchain_community.llms import OCIModelDeploymentVLLM

llm = OCIModelDeploymentVLLM(
    endpoint="https://modeldeployment.<region>.oci.customer-oci.com/<md_ocid>/predict",
    model="odsc-llm",
    streaming=False,
    temperature=0.2,
    max_tokens=512,
    n=3,
    best_of=3,
    # other model parameters
)
```
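
Once constructed, the object exposes the standard LangChain LLM interface. A hedged usage sketch, assuming `llm` was built as above against a live, reachable endpoint:

```python
# Single completion against the deployed model.
response = llm.invoke("Tell me a joke.")
print(response)

# Token-by-token streaming (requires streaming=True on the constructor):
# for chunk in llm.stream("Tell me a joke."):
#     print(chunk, end="", flush=True)
```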

## Extends

- `OCIModelDeploymentLLM`

## Properties

- `n`
- `k`
- `frequency_penalty`
- `presence_penalty`
- `use_beam_search`
- `ignore_eos`
- `logprobs`

---

[View source on GitHub](https://github.com/langchain-ai/langchain-community/blob/a6a6079511ac8a5c1293337f88096b8641562e77/libs/community/langchain_community/llms/oci_data_science_model_deployment_endpoint.py#L894)