# OCIGenAI

> **Class** in `langchain_community`

📖 [View in docs](https://reference.langchain.com/python/langchain-community/llms/oci_generative_ai/OCIGenAI)

OCI large language models.

To authenticate, the OCI client uses the methods described in
https://docs.oracle.com/en-us/iaas/Content/API/Concepts/sdk_authentication_methods.htm

The authentication method is passed through `auth_type` and should be one of:
`API_KEY` (default), `SECURITY_TOKEN`, `INSTANCE_PRINCIPAL`, or `RESOURCE_PRINCIPAL`.

Make sure you have the required policies (profile/roles) to
access the OCI Generative AI service.
If a specific config profile is used, you must pass
the profile name (from `~/.oci/config`) through `auth_profile`.
If a specific config file location is used, you must pass
the path to the file containing the profile configs
through `auth_file_location`.
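For reference, an OCI SDK config file follows the standard INI layout below. This is a minimal sketch with placeholder values only; consult the Oracle SDK configuration docs for the fields your auth method requires.

```ini
; ~/.oci/config — illustrative placeholder values, not real OCIDs
[DEFAULT]
user=ocid1.user.oc1..example
fingerprint=aa:bb:cc:dd:ee:ff:00:11:22:33:44:55:66:77:88:99
key_file=~/.oci/oci_api_key.pem
tenancy=ocid1.tenancy.oc1..example
region=us-chicago-1
```

A profile name other than `DEFAULT` (e.g. `[MY_PROFILE]`) is what you would pass as `auth_profile`.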

To use, you must provide the compartment ID, endpoint URL, and model ID
as named parameters to the constructor.

## Signature

```python
OCIGenAI()
```

## Description

**Example:**

```python
from langchain_community.llms import OCIGenAI

llm = OCIGenAI(
    model_id="MY_MODEL_ID",
    service_endpoint="https://inference.generativeai.us-chicago-1.oci.oraclecloud.com",
    compartment_id="MY_OCID",
)
```
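Once constructed, the instance exposes the standard LangChain LLM interface, so a prompt can be sent with `invoke`. The sketch below is illustrative: the `auth_type` and `auth_profile` values are placeholders, and running it requires valid OCI credentials and network access.

```python
from langchain_community.llms import OCIGenAI

# All values below are placeholders; auth_type defaults to API_KEY,
# and auth_profile names a profile in ~/.oci/config.
llm = OCIGenAI(
    model_id="MY_MODEL_ID",
    service_endpoint="https://inference.generativeai.us-chicago-1.oci.oraclecloud.com",
    compartment_id="MY_OCID",
    auth_type="SECURITY_TOKEN",
    auth_profile="MY_PROFILE",
)

# Standard LLM call: pass a prompt string, receive a completion string.
response = llm.invoke("Tell me one fact about Oracle Cloud Infrastructure.")
print(response)
```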

## Extends

- `LLM`
- `OCIGenAIBase`

## Properties

- `model_config`

---

[View source on GitHub](https://github.com/langchain-ai/langchain-community/blob/a6a6079511ac8a5c1293337f88096b8641562e77/libs/community/langchain_community/llms/oci_generative_ai.py#L221)